Monday, March 7, 2016

Cleaning up after aws cli on Mac OSX...

I've installed the AWS CLI on my Mac. It's super handy. However, the aws s3 command creates a $folder$ file for every "directory" when a recursive copy is performed, which is super annoying.

For example, you might have a "directory" in S3 named "myfiles". When you download the objects with "myfiles" in the path, you end up with an extra file named "myfiles_$folder$".

Running aws --version returns this info:

    aws-cli/1.10.6 Python/2.7.10 Darwin/14.5.0 botocore/1.3.28


I haven't found anything that explains how to prevent those files from being created, so I've been doing manual cleanup afterwards. This is the command I run (the single quotes keep the shell from expanding $folder as a variable, and find's -delete flag avoids the word-splitting problems of rm $(find ...)):

    > find . -type f -name '*_$folder$' -delete
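It's easy to verify the cleanup in a throwaway directory before pointing it at real downloads. This is a minimal sketch; the myfiles/data.txt layout is invented for the demo:

```shell
#!/bin/sh
# Recreate the downloaded layout in a scratch directory, then remove
# the "_$folder$" marker files with find.
set -eu
tmp=$(mktemp -d)
mkdir -p "$tmp/myfiles"
touch "$tmp/myfiles/data.txt" "$tmp/myfiles_\$folder\$"

# Single quotes keep the shell from expanding $folder as a variable;
# -type f skips directories; -delete sidesteps the word-splitting
# problems of rm $(find ...).
find "$tmp" -type f -name '*_$folder$' -delete

# The marker is gone and the real data survived.
test -f "$tmp/myfiles/data.txt"
remaining=$(find "$tmp" -type f -name '*_$folder$' | wc -l | tr -d ' ')
echo "markers remaining: $remaining"
rm -r "$tmp"
```

As an alternative, since the markers are real objects in the bucket, aws s3 cp and aws s3 sync accept an --exclude pattern, so adding something like --exclude '*_$folder$' to the recursive copy may keep them from being downloaded in the first place. I haven't verified that on this setup.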


