I’ve been interested in getting a dashcam for my car(s) for a while, and when Woot had an Armorall Dashcam up for $29, I decided to give it a try. The verdict? This thing is worth about as much as I paid for it. Yes, it records video and audio. Yes, it datestamps it. Yes, it turns on and off automatically when I start the car, and yes, it loops data on the MicroSD card just fine. Other than that, I can’t find a lot to recommend it. The field of view is narrow, the interface is painfully obtuse, and the mirror mount is… weird. Works, but weird.
Anyway. What did happen, though, was that while driving to work this morning, a trailer in front of me had a tire blowout. The sound was funny, a squeaky whoosh (as opposed to a bang), and I missed seeing it happen (it’s at 7:53:17 in the video). I only understood what was going on when I saw smoke and bits of tire flying off the trailer. I backed off and put on my 4-way flashers for folks behind me, and let the guy pull over to the side. The wheel was pretty wrecked, and the sparks and tire debris were pretty dramatic.
If this had actually damaged my car or something more serious had happened, having the recording to show my insurance company and/or police would have been a huge win. But that’s what dashcams are all about, right?
Remember a couple days ago I posted about using tethering with my Moto X phone on vacation? Just a brief reminder that it’s always a good idea to remember you are tethered and “on the meter” when things get rainy and quiet, and you decide to do a little YouTubing.
For the longest time I was stuck in a weird no-man’s land regarding WiFi Tethering on my cell phone. I’m referring here to the practice of enabling a hot spot on the phone so other devices, such as a laptop, can share the data connection the phone is using. This is super-handy when in an area that either has no WiFi service, or the service is sketchy as hell.
Problem is, I had an Unlimited data plan with AT&T. And with that unlimited plan, hotspot service was not available.
A year or two ago I made the change and moved my data service to a family plan with a shared data pool. 5 gig a month spread over 4 phones. We haven’t come anywhere near that limit, even with some heavy duty usage, so all in all, a good choice. What I forgot though, was that by going to a metered billing structure, I was able to start using tethered mode.
My cell phone is a Moto X, aka a Moto X Pure Style. I’m deliriously happy with it, so setting it up as a Wifi Hotspot would just bring it to a new level of functionality.
Enabling it was easy. I was initially worried about performance, but after connecting to it with my laptop and running Speedtest, the numbers are pretty good.
Working from my laptop over the tethered connection is just like sitting at home. I’ll need to set up a better “show how much data I’m using” mechanism, but right now, this is pretty cool.
This one came up while working on my home network / photo management setup. I’ve set my Synology DS216+ NAS to use Cloud Sync to back up my files to an Amazon S3 bucket (see this post for some more information on using S3 for backups). The problem was it was taking a very long time, and I needed to figure out how much had transferred.
Unfortunately, Amazon has no simple mechanism for determining the size of an S3 bucket. I found a couple posts on StackOverflow showing how to do it, but they seemed overly complex.
While you can get a bucket size using several third party GUI tools, the command line approach is quick and easy. It does require the AWS Command Line Tools to be installed, and access keys generated, but once that’s done, you can quickly query Amazon for just about anything.
Here’s the command I used to determine the size of my bucket. This is on a mac:
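A one-liner along these lines does the trick (the bucket name here is a stand-in for your own, and it assumes the AWS CLI is installed and configured with your access keys):

```shell
# List every object in the bucket; --summarize appends a total object count
# and total byte size at the end. "my-photo-backups" is a placeholder name.
aws s3 ls s3://my-photo-backups --recursive --summarize \
  | awk '/Total Size/ {printf "%.2f GB\n", $3 / 1024 / 1024 / 1024}'
```

The `--summarize` output reports the size in raw bytes, so the awk bit on the end just converts it into gigabytes for easier reading.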
Face it, I’ve been on the net a long time. Usually by the time some buzzword I’ve heard gets enough attention that I check into it, it turns out the hype doesn’t even remotely match the reality. And thus it’s been for me with Podcasts. You can’t swing a dead cat without hearing someone say “And subscribe to our podcast!!!”
Maybe it’s just my early trauma dealing with downloaded files / iTunes syncing problems / PalmPilot lack of audio, whatever, but I never listened to podcasts, even when a friend would say “Hey, did you hear that podcast by Bubbitah Bingah? Dude was awesome.” “Uh huh, what’s the link to the text of the article again?”
Well, even old dogs can learn new tricks. Now that I’m into the third year of dealing with a 40 minute commute, all along highways, with nothing but XM radio or NPR to listen to, I decided to finally take the plunge and check out podcasts. Herein lies what I’ve learned.
Get a good podcast app
First things first. You need a way to listen to podcasts. For me the important mechanism would be something that allows for downloads ahead of time (say, over Wifi), and then I could catch up as the week went along. I’m fully wedded to Android now, happily using my Motorola Moto X for gaming, music, mail, calendaring – heck, everything. So a decent Android app was needed. I settled on Podcast Addict. It’s a great app that categorizes all my subscriptions, and lets me download all or some of the episodes ahead of time.
Now all I needed was content
So, what do you want to listen to?
There’s seriously no shortage of podcasts out there. And, frankly, most of them suck. When you have no need to limit your time to a 10 minute slot on a radio show, and can blather on for an hour and a half about navel lint, the field gets crowded pretty quickly.
I recommend starting with things you know – being an NPR addict, these were easy:
Wait Wait Don’t Tell Me
This American Life
But after that, we start getting into things that are related, but you don’t get to hear quite as often.
Ted Radio Hour
And then we get into fun stuff. Up until this point, all of that stuff I’d hear on the radio from time to time (either on NPR or on Public Radio Remix – a station I highly recommend, btw). But what about independent stuff? This is what I came to share with you.
First, I have to highly highly recommend Our Fake History. This podcast focuses on deep dives into historical myths, legends, and stories, and digs out what parts of those stories are true and what has been embellished over time. I got completely sucked into their first ‘big’ series, “Was There Really a Trojan War”. I learned more about the Iliad, Greek mythology, and 19th century archeology than I had ever known before. The current series is going into Helen of Troy, and it’s equally fascinating. Highly recommended!
Following along after is Lore. In a similar vein, this podcast talks about history with a sort of dark bent. Vampires, missing persons – where did all these stories come from? It has a darker, more ‘sitting around the fireplace telling stories’ feel, but all of it is well researched and detailed.
Moving off the dark history bit, I also listen to the Petapixel podcast. This series follows the website pretty closely, but has extra commentary and thoughts by Mike “Sharky” James. Great stuff.
One last shout out. My friend Tim pointed me to Welcome to Night Vale. This is the fictional broadcast of a public service radio station in the town of Night Vale. Think of it as a sort of Prairie Home Companion meets HP Lovecraft. It has its ups and downs, but has some great moments in it.
So, if you’ve ever thought about fiddling around with podcasts, I recommend getting Podcast Addict (it’s free), and giving these podcasts a try. There’s lots to be learned.
When I started doing semi-professional photography a few years ago, I knew that I’d need to step up my game when it comes to photo management, processing, long term archiving, and, of course, the ever neglected marketing. Some of these I had a CLOO about, others were rocky roads of experimentation, research, and late night frustrations.
After a lot of research, blog-reading, chatting, and hard decision making, I think I’ve boiled things down to a workable, relatively elegant, yet flexible environment. I present here the results of two years of “How the HECK am I going to do this???”
This article is primarily about my infrastructure, e.g. the components I’m using, how they interact with each other, and some of the lessons I’ve learned. A full walkthrough of my actual photo process will come in a later post, so for this installment, let’s look at the players…
Adobe Lightroom CC
Love it or hate it, Lightroom is the undisputed champeen in the photo management world. People can argue one way or another about whether Lightroom is the One True Photo Tool, but let’s face facts. They own the space right now. Sure there are issues with speed, and Adobe isn’t exactly the warmest and fuzziest company on the planet, but Lightroom is the best supported, most actively used, and best known of all the options. Coupled with Photoshop and other toolsets, it’s hard to make an argument against it.
Apple Macbook Pro
I love my Mac. You hear that a lot, and you’ll also hear the detractors going on about Mac Fanbois and all that hoohah. When it comes down to brass tacks, you can’t beat a Mac for fostering the kind of creative environment needed for artistic work. And let’s not beat around the bush. Photographers are artists. Our tools should enable us to create and share images we see through our viewfinders and in our minds. You can’t do that when you’re dealing with crapola environments like Windows or spending all your days tinkering with configurations in Linux just to get a youtube video to play. (Full disclosure here – I LOVE linux. I work on it every day. But it can’t hold a candle to a Mac for the fit and polish of its desktop environment. Srsly.)
Synology DS216+ NAS (Network Attached Storage)
Now we’re getting down to it. You can’t take pictures in the digital age without a safe place to store them. My Mac can only hold so much data, and there’s something very iffy about storing unique, critical files on a device that you frequently toss around your living room, sling on your back, or carry on the bus. One thing I’ve always said is consider your laptop as expendable. No critical information should be on it that you absolutely cannot afford to lose permanently at a moment’s notice. Cuz every laptop is one “oops!” away from being run over by a car, falling in the sink, or getting stepped on. My NAS is 3 terabytes of mirrored storage (6TB total) that stays on the shelf at home. I don’t carry it on the bus, and it’s unlikely to get run over by a car. It’s fast, easy to work with, and relatively affordable.
Amazon S3 Glacier Storage
Even with a home NAS, you still need backups. And I want to underline something here. “Backups” are not just cloud-based ‘PC backups’. Many services are simply copies of your local hard drives in the cloud. If you mistakenly overwrite a local file with something wrong, or delete a local file, and your backup system runs, congratulations! You now have a backup… OF YOUR MISTAKE. The original file is now gone in both locations! Many services do allow for ‘historical’ archives, where you can retrieve a previous version of a file from the cloud, but be very careful when choosing your offsite storage environment. I use Amazon Glacier, but I understand this may not be for everyone. Glacier is a service built on top of Amazon S3, which is part of AWS. Files uploaded to S3 are available immediately, but I’ve set things up so that after a month they ‘settle’ into Glacier for long term archiving. They’re still retrievable, but it may take a few hours to get them back. Why do this? Because Glacier storage is 1/10th the price of standard S3. As of this writing, Glacier is $0.007 per GB / month. My entire photo archive is approximately 400GB, so storing this in Glacier costs me $2.80/month. If I were to use S3 in ‘standard’ mode, it would be $0.03 per GB / month, or about $12. (There’s a middle tier called ‘infrequent access’ that is $0.0125 per GB / month, which works out to $5.) Regardless, these prices are VERY low, and are easily within reach of a humble photographer. My NAS allows for easy synchronizing of my raw photos directly to S3 and Glacier, so I always have an off-site copy of my photos.
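For the curious, that “archive after a month” behavior is just an S3 lifecycle rule. Here’s a sketch of setting one up from the command line – the bucket name is a placeholder, it assumes the AWS CLI is configured, and the same rule can be built in Amazon’s web console instead:

```shell
# Lifecycle rule: transition every object to Glacier 30 days after upload.
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "archive-to-glacier",
      "Prefix": "",
      "Status": "Enabled",
      "Transitions": [ { "Days": 30, "StorageClass": "GLACIER" } ]
    }
  ]
}
EOF

# Apply it to the bucket (skipped quietly if the AWS CLI isn't installed).
command -v aws >/dev/null 2>&1 && aws s3api put-bucket-lifecycle-configuration \
  --bucket my-photo-backups --lifecycle-configuration file://lifecycle.json || true
```

Once the rule is in place, Amazon does the ‘settling’ automatically; nothing else to remember.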
Sandisk 128GB USB3 Thumbdrive
When I first got my Mac (now over 3 years ago), it came with an internal SSD drive with a whopping 250gig of storage. “PLENTY OF ROOM!” – haha. I laugh now. That’s not enough to do all the other things I do on the Mac, and also do my photos. It’s very difficult to upgrade these machines, so I had to look around for options. Initially I was carting around an external 1TB Toshiba USB3 drive, which was… ‘fine’, for a while, but extremely fragile. If the USB cable came out while working, I immediately had to do a rebuild of my Lightroom catalog, and things went pretty squirrely. Since this is, after all, a laptop, that drive was always dangling off the edge of the couch or in other precarious positions. With thumb drives getting larger (storage-wise), a 128gig drive literally the size of my thumbnail, one that could live in the USB slot full time, seemed like a good answer. So now my catalog and working photos live on the thumb drive… bye bye 1TB external!
Pulling it all Together
Now, those who have gone down this road, if you’re still reading, have probably already seen problems with how all this is supposed to work. “Nice NAS, Dave, would.. be a shame if.. you couldn’t access it all the time!” – This is, alas, a true problem.
Having oodles of disk storage at home is all fine and dandy, but that doesn’t help when you’re parked at your local Starbucks, jammin to some tunes, and want to get all creative while slurping that double-mocha latte grande moobah moobah drink thing. (Okay, I don’t spend a lot of time at Starbucks. Sue me). But the problem still stands. If you don’t have access to your photos while away from home, how can you get things done when the muse strikes?
My solution was to split my photos into “Things I’m working on now” and “Things I’m pretty much done with”. The latter lives on the NAS, and when I’m home, I plug in an ethernet cable to my Mac, and voila! High speed access to the NAS! (Note for the geeks – Yes, you can access a NAS over wifi, or even remotely over the internet. But this is not a speedy process, in particular when working with Lightroom, really large photo libraries, and photos that are 26meg a pop. Go hardwire or go home).
Initially I was concerned this approach would cause Lightroom to have kittens. It would mean a large portion of my photos would not be available when I was on the road. But I’ll give Adobe credit. They did things right.
Lightroom is essentially a database. It indexes the ‘raw’ photo files, and keeps track of all the changes that have been applied to them. If the raw files are not available, Lightroom basically goes ¯\_(ツ)_/¯ and just shows you a low resolution preview of the last time it worked with that photo. You obviously can’t do much with that, but Lightroom doesn’t seem to care that the source file is unavailable. When I get home, plug in the cable and remount the NAS, ding! I have a high resolution image to work with again.
The next good thing is that Lightroom has a decent file manager. Moving files from my Mac (which is where I import raws from SD cards) to the NAS is simply a matter of dragging and dropping the directories in the Navigator. Badabing, badaboom, the files get moved, Lightroom updates its database to keep track of where the files are, and I’m all done.
So what actually constitutes a workflow? Well, as I said earlier, I’ll detail my post-processing steps in another article, but here’s the steps from a pure photo management perspective:
Shoot using the Canon cameras, storing RAWs on SD cards
When ready to load the photos from the shoot into Lightroom, import the photos from the SD card (using the Mac SD slot) into Lightroom, storing the photos on the 128G thumb drive.
Do whatever post-processing is needed. Photographers know this process can take a while. With the 128gig drive, I can have many sessions stored locally on the mac, and work through whatever is needed without worrying about space.
Eventually, after photos are delivered or published, I don’t need them locally anymore, so I use the Navigator to move the raw import folders over to the NAS. The files are copied over, the local database is updated automatically, and I free up a couple gig for the next shoot
The NAS, sometime in the next few hours, automatically backs the photos up into Glacier
Does it work?
After all that, how well does it work? Turns out, pretty durned well.
It took me about 2 months to put all this together, involving a bit of trial and error. There were some tricks with network configurations that won’t affect most users, so that complicated things. I tried working with pure wifi service to the NAS, but that was too slow for words. Installing a small dedicated gigabit ethernet switch was the final step that made the whole thing useable.
I find performance with the NAS to be quite good. It’s on a par with working with a local USB3 drive. I don’t feel having my files “over there” has any real impact (other than mobility) on things. Admittedly, there’s a comfort factor knowing my files are stored on a relatively stable, mirrored server, as well as being backed up into the cloud, and the convenience factor of just plugging in my cable at home to gain full access to them really can’t be downplayed. I CAN get to my files remotely over wifi, or, if I do enough juggling, even reach them over the internet. But for sit-down, true post work, this configuration is stable, fast, and useable. I’m a fan.
It’s no secret I’ve been having a great time hanging with the folks at MakeIt Labs in Nashua, NH. Many of the projects I’ve been working on have only been possible with their help and collaboration. Not in a “here lets do this for you” sense, but in providing a community where ideas can be bounced around, coupled with a physical space with every tool a geek could ever need at hand.
I’ve unofficially become the person organizing the parts supplies. These are ranks and ranks of bins that hold everything from capacitors to stepper motors to hot glue sticks to arcade pushbuttons. Understandably, these things can easily get out of control, so constant pruning and management is sort of a requirement. I can do that!
A new set of drawers we picked up are super-handy, but they’re just empty metal boxes. About 10″x12″x4″. Nice, useful, and stackable, but we tend to store lots of little parts, so we need to be able to divvy up that space a little more. We needed something like trays that could go into the drawers (which are all about the same size), to store small parts. The tray should be easily removable (take the tray out, use some of the parts, put it back), and easy to make many of them. We have about 120 drawers that need inserts. This sounds like a job for our 80w CO2 laser!
I had done some basic work on the laser, but this would be my first ‘build from scratch’. After measuring out the drawers, I decided to make a 9″ square baseplate, with 4 sides, and a single divider down the middle that could easily be picked up. I used Adobe Illustrator to set where the cuts would be (Illustrator is great primarily because drawings measured in it translate perfectly to the laser cutter. No scaling / stretching problems. When I say ‘cut something 9″x9″‘, what I get is something 9″x9″.)
I manually did all the crenelations where the pieces would fit together. A fellow maker pointed out there’s software that helps do this, but for this first runthrough, I was okay doing it by hand. The material I was using is 1/8″ acrylic sheeting. Somewhere the lab picked up a metric buttload of the stuff, so we literally had dozens of square meters to work with.
Total cutting time was about 3 minutes. The laser had no problems working with the material. After removing the pieces from the machine and taping them together, I had a mocked up tray insert! Hooray!
It wasn’t all peaches and cream. I did mess up measurements on two of the tabs, and forgot to put in a cutoff for one small extension. After assembling what I’m referring to as the ‘1.0’ version, I realized there should be some changes. The central divider should tuck under the end pieces to give it better strength (it’s slotted in on the top now), and I should make a version of this that has 3 spaces in it, not just two. Tighter tolerances on the slots are needed (I measured 1/8″, but the ablation from the laser takes off a little bit more, so the slots are wider than they need to be).
Next step will be to re-do the cuts with the supporting tabs, remove the paper from the acrylic, and glue things together. If all goes well, I’ll have a nice insertable tray, and the ability to crank out many more without much work. Going full-on production of over a hundred of these trays will require an inventory of how much acrylic we have, and a decision on if we want to just pick up a few dozen sheets of 1/8″ birch (which would negate the ‘peeling off the paper’ problem).
I’ll post when there’s an updated sample. But for now… I played with lasers, and it was awesome.
I’ve been on the lookout for a new game to put my new Moto X Pure Android through its paces; it’s an extremely high powered device that seems perfect for games. Ever since I saw the tablet revolution taking over gaming, I’ve been hoping for a decent, realtime, immersive game that I could get behind. (Why WoW and Eve aren’t on tablets yet is beyond me).
My son Zach was a huge booster of MOBA games before they were cool. DOTA2, and later League of Legends were daily activities. I tried them off and on, but found the complexities and knowledge curve too much for casual gaming.
Many companies have claimed to make the MOBA experience enjoyable on a mobile device, but this is the first one that’s gotten me completely hooked. I’m still in casual play mode, but I’m finding it intensely enjoyable. The graphics are magnificent, the characters interesting and varied, and the gameplay is perfect. It’s a dead-on implementation of the MOBA ideals (and yes, it has last hits :).
I’ve put in a couple hours so far, getting a feel for 3 of the heroes. There’s so much more to learn – if you watch the videos on the Vainglory channel on Youtube, watch the detailed rundowns of how to play each hero. The technicalities are vast and deep, and it’s unlikely I’ll ever get to that point with more than 1-2 favorites, but I’m ecstatic that the company behind the game (awesomely named ‘SUPER EVIL MEGACORP‘), spared no expense in making the game easy to get into, but also having huge depth to it.
Yeah, I’ve been pretty focused on drone racing, but this is pretty epic.
My first ‘radio control’ experience was building a Tamiya “Subaru Brat” model when I was in my 20s, and that helped later when I started building drones. These cars are a little different, but the feeling is similar. Pretty nifty stuff.
About two years ago, I re-launched this blog. Since then it’s become my primary “I gots stuff to say” mechanism. For quite a while I hoped Google Plus would reign supreme, but it’s become readily apparent that platform is buckling via “Death from a Thousand Cuts.” Google is destroying any hope it had of dethroning Facebook one feature at a time.
Realizing that, I put more effort into making Planet Geek my main sounding platform. I re-launched the site, imported all the old content into it, gave it a facelift, and started writing again. Sadly, with the most popular services not supporting RSS, just having the blog there means many people I’d like to keep in touch with simply won’t ever see the content. I needed a way to stay in touch with my friends, family, and social connections, without having to repost the same thing over and over and over again.
By far the industry leader is Facebook. I briefly considered using it as my primary soapbox, but I just can’t bring myself to subscribe to their “We will capture all the content, all the clicks, and all the users, and share none of it outside our walled garden” approach to media. The final straw is their constant tweaking of “We will only show you what we think you should see” (more rants on this in another post). So, no Facebook for me… so where should I go?
In the end, with respect to which social media platform I should settle on, I’ve chosen none of them, and all of them.
Planet-geek, running WordPress, is my go-to platform. I do 99% of my writing here, and whatever writing I have that passes for “creativity” is created using WordPress content tools. But that isn’t enough, is it? Our online social circles are fragmented and isolated. One group lives on Facebook, another lives on Livejournal, some are still on Plus, etc etc. They would never see the posts unless I manually reposted either the entire article or direct links to everything I write.
There’s no way to cover all the bases, so I’ve done the next best thing. I chose carefully where I create and publish content, but I’ve also built links that automatically share, if not the entire content, at least a notification to all the media channels I want to reach. I have to shout out to Nextscript’s SNAP tool for making this as painless as possible. SNAP (Social Network Automatic Poster) can link my blog to just about every social network out there. I’ve set up many links, and the tool works flawlessly.
But I do create content in other places. My photography needs a creative channel, and WordPress just isn’t the tool for it. So, Flickr and Instagram come into play. Wait, but sometimes I post to Twitter directly, what about that? Yeah, okay, that’s there too. Fortunately, many of these sites (unlike Facebook) allow for external notification / sharing of content. If I post a picture to Flickr, it has an automatic notification mechanism to Facebook. Instagram does the same thing. Sadly, Google Plus has none of these tools, and also has no easy API for posting content, so it tends to be the last thing updated (I need to do it by hand).
Thinking about this, I realized that my ‘communication flow’ would make a nice visual. The graphic above is a map of the public sites I use for social media / interaction. I’ve deliberately left off chat systems and email (I use IRC, Slack, Hangouts, Skype, and of course Email). For the most part, all these services notify me back via Email, so in theory, I should be able to just watch my inbox for interactions. A lot of times that doesn’t work so well. Still working on that part!
This was a fun chart to put together. It shows the results of months of tool configuration, auto-linking, loop detection (yeah, don’t set up auto-posters to one service that is auto-posting back to the original), etc.
Am I missing anything? Let me know… er, on the blog if you can. 🙂
What a scam. Shame on you Techspot. Take a look at that “Kit”. It’s the baseline Raspberry Pi, at a slightly higher, but still “in the realm of normal” price. A case / kit – well, okay, that’s helpful, though pricier than what you can find on Amazon with 5 seconds of searching… and 4 ‘courses’, at $200 each. Yes kids, they’re valuing information anyone can get with 10 seconds of googling at $200 a pop.
A recent article appeared on Petapixel regarding a Montreal photojournalist having all his photos stolen by burglars:
A photographer’s worst nightmare just happened to a well-known photographer: on Monday, Montreal-based photojournalist Jacques Nadeau returned home to find that burglars had stolen all the photos he has taken during his life and career.
CBC News reports that Nadeau, a photojournalist for the newspaper Le Devoir, walked into his home to find that five of his hard drives had been stolen.
They contained an estimated 30,000 to 50,000 photos captured over the course of his 35-year photography career.
This is a terrible story, and absolutely devastating to the photographer. My heart goes out to him. But we can take a lesson from this…
Embrace the Paranoia. Always ask “What if….”
Take a look around you. At your life, at your belongings, at things you hold dear. Ask yourself “What would happen if this were lost or destroyed?” If the answer is “This is irreplaceable”, then move on to the next question “How can I protect these things in a way that makes sure they’re never lost?”
For anyone in the digital world, the answer is simple. Backups. There are myriad sites singing the song “Always do your backups!” and “Here’s how to back up your things!” I won’t go into detail here. But people should extend that idea to other things of value. Important documents. Printed photos. Artwork. That doll from your youth. Look at these things of value and be a little paranoid. “How could this be destroyed?” Some china inherited from a relative – is it on a shelf that can be knocked over easily? A doll you once cuddled as a child, perhaps putting it out of reach of the dog would be a good idea?
Yeah yeah, okay. So how do YOU do it?
I’m glad you asked! This article happened to appear while I was in the middle of backing up my photo library!
Currently, I do all my photo work in Aperture. Apple has announced that this product is being end-of-lifed, so no matter what, I’ll need to do a bunch of work migrating photos. I keep my photo library on an external 1TB USB3 drive, and I’m acutely aware of how fragile that is. Hard drives fail constantly, and having all my eggs in one basket is never a good idea. The challenge is, photo libraries are BIG. Hundreds of gigabytes of data. If I were to try to back up my Aperture library onto DVD+R DS (the largest ‘consumer level’ long term storage medium available at 17G per disc), I’d need 31-some-odd discs. That’s too many, and cumbersome as heck to work with.
I considered Dropbox, Box.net, Google drive, and Amazon Drive, but I feel these are targeted at a desktop user who just wants a drive out in the cloud. While I use Dropbox extensively for making photos available to customers, its sync mechanism is quite tricky if what you’re storing on Dropbox is much larger than what you can store locally. I’m also not confident these systems will last, unchanged and accessible, for the long term. Google, in particular, has a dreadful record for keeping products and offerings available for the long run.
In the end I decided to use a pretty technical solution: Amazon S3 storage.
Backing up to S3 and Glacier
Amazon has a bulk storage system called S3, coupled with a ‘long term storage’ system called Glacier. S3 is in essence a big storage bucket where you can drop files and retrieve them at will. Glacier allows you to take S3 elements and put them in, as you might guess from the name, ‘cold storage’. The cost for S3 storage is extremely low ($0.0240 per GB per month, or for my 600G of photo data, about $14/mo). If I move those files into Glacier, it drops to $6/mo. The difference is that restoring data from Glacier may not be immediate – it may take a few hours for your files to be available. For this sort of long term storage, that’s fine by me!
This is not as cheap as current offerings from Amazon Prime (Unlimited storage free with Prime and Amazon Drive). But I’m still very skeptical of the ‘drive’ offerings from the big players. Everyone is trying to get into the “cloud drive” market with custom clients and apps. My storage needs are exceedingly simple. About 300 very large files (copies of each of my photo projects). S3 is extremely well established, and used widely in the industry.
With S3, to back up my library, I go through these (for me) straightforward steps:
In Aperture, I select a project, and say “Export to library”. I locate that library on my external drive. This is an exact copy of my original masters / RAW images, as well as all the ‘versions’ I may have created (all in JPG form). It also includes metadata and Aperture edit notes. While I know Aperture is not long for the world, I at least have things backed up. This results in a directory that contains a mini ‘aplibrary’ containing all my files.
From the command line, I make a ‘tgz’ of that directory. This compresses the directory down into a single file. If I were so inclined, I could do this on the Mac just by selecting the directory and choosing ‘Compress’ – that will create a .zip file containing the entire library.
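If you’re curious what that command-line compression step looks like, here’s a minimal sketch. The directory name is just a stand-in for one of the exported libraries; the sketch creates a tiny example directory so the commands are runnable as-is.

```shell
# Stand-in for an exported Aperture library directory (the name is illustrative).
mkdir -p "Example.aplibrary"
echo "demo" > "Example.aplibrary/Info.plist"

# Compress the whole directory into a single .tgz archive.
tar -czf "Example.aplibrary.tgz" "Example.aplibrary"

# Sanity check: list the archive's contents before trusting it as a backup.
tar -tzf "Example.aplibrary.tgz"
```

For a real library you’d point `tar` at the exported `.aplibrary` directory itself; the result is a single file that’s easy to upload in one shot.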
Next, I copy the file up to S3. Because I’m a super-geek, I do this right on the command line using the Amazon credentials I created a while back. If you’re a GUI person, you can use any number of S3 clients for the Mac or PC. For me, I do:
aws s3 cp 2014-09-23\ CA\ Over\ 15k.aplibrary.tgz s3://daveshevettphotos/ --profile personal
After some time (some of the libraries are quite large; a 25 gig wedding archive took 85 minutes to upload), I have an offsite backup of that photo library! Hurray! At any point I can go to the Amazon S3 console and put these files into Glacier for long term storage, or download them as needed.
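Rather than moving files into Glacier by hand in the console, S3 can also do this automatically with a bucket lifecycle rule. Here’s a sketch of such a rule (the rule ID is made up, and the 30-day window is just an example), which you’d apply to the bucket with `aws s3api put-bucket-lifecycle-configuration`:

```json
{
  "Rules": [
    {
      "ID": "archive-photo-backups",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 30, "StorageClass": "GLACIER" }
      ]
    }
  ]
}
```

With a rule like this in place, anything uploaded to the bucket slides into Glacier on its own after 30 days, and the storage bill drops accordingly.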
I realize this process is not for everyone. I share it here simply to raise awareness that in the modern age, many of our most important things are stored in an ephemeral, easily lost way. Take the time to look around and see what you could lose if something were to happen: something as simple as your laptop being stolen, a broken water pipe, or even a home fire. Always ask, “What if…?”
In part 1, I described the new sport of FPV drone racing. In this post, I’ll tell you what it’s like to try and take those YouTube videos and star-eyed ideas and make them real – i.e., build and fly my own drone.
Once I understood the details of what a 250mm racing drone was, I had to buy one. Getting parts and pieces and assembling the entire thing from scratch was daunting: what sort of ESCs, what sort of flight controller, and so on.
That was March 14th. Little did I realize, I had made a classic blunder that’s all too common in this new sport. The frame I ordered was from China, and would take at least 3 weeks to arrive. Agony! Oh well. Let’s make the best of it. I spent the intervening time building out my secondary parts inventory. A transmitter and receiver. Batteries. A carrying case to hold it all. Charger. I would be ready.
Finally, the frame arrived, and it was time to get to work. I unpacked the (small) box and laid out all the parts. Have to admit, the box looked less than promising. After driving myself bonkers looking at FPV videos, talking with folks online, etc, this sure didn’t look like what I had hoped it would be.
Opening it up and sorting through parts, things started looking better. Everything was there, and it even looked pretty good. Machining was good, parts were as expected; all I needed to do now was put it together. I had blocked off the evening to do the assembly, and it took all of that to get from “piles of parts” to something that started to look like an actual drone.
Anyone who has ever built an RC model knows what comes next. Doesn’t matter that this thing you’ve dreamed about sorta looks like what you imagined, you have a long road between “looks done” and “it’s in the air”. The first trick was wiring the power harness so all of the ESCs would have power to drive the motors. Some drones use a Power Distribution Board (or PDB), but this particular configuration didn’t have one, so I needed to wire up my own. Lots of soldering later, I realized the power connectors on my batteries didn’t match anything I had, nor did they match the charger I was using. Arrrgh. I suppose this is what happens when you build something from scratch, on a platform that really hasn’t solidified.
Somewhere around here I joined up with the MakeIt Labs folks up in Nashua, NH. They have a pretty rabid drone group there, and these guys were unbelievably helpful in guiding me up this steep learning curve. I learned that most folks use XT60 power connectors, so I ordered up a handful of those.
So, ready to go, right? Yeah, not so much. My FC (Flight Controller – a CC3D from the OpenPilot project) needed to be programmed and calibrated with my motors and ESCs. This is not a trivial process, and I was getting frustrated that my motors were not spinning up appropriately. Turns out, I had a blown ESC. ANOTHER BLOCKER. After much hand-wringing about ‘can you mix different kinds of ESCs on a single quadcopter’, I took the plunge, ordered 4 more ESCs, and after they came in, installed one onto the drone. More calibration, and… okay, now the motors are spinning under test, but are not responding to radio control at all. On the other hand, it LOOKED like a drone, smelled like a drone, it just… wouldn’t fly like one. (BTW, after sharing this picture, the folks at the lab were like “That’s a STUPIDLY large battery. You know most folks fly with a 1300mAh battery, right? You’ll save weight and space using a more appropriately sized battery.”) So, 2 new batteries ordered.
Here I have to give a bit of a shout out to the OpenPilot peeps. I understand there’s a little back and forth in the community about who owns the software, who owns the boards, and the like, but the OpenPilot GCS (ground control station) software is outstanding – running flawlessly on my Mac and giving me enormous control and detailed information about my flight controller. The CC3D controller itself can be had for around $25, and, as a geek who has seen some pretty complex little controller boards, what this thing can do is nothing short of amazing, for such a low cost. Very fast signal processing, control, and durned good communication / feedback to the groundstation software. The CC3D flight controller is being slowly replaced by the Revolution board, but that’ll be an upgrade for the future. Right now, I love my little flight controller, and am so grateful to the developers and community that made it possible.
Eventually I got all the factors aligned, and my drone took to the air. Flying Line Of Sight (or “LOS”) is the normal way people expect RC planes to fly. Watch the craft in the air, learn the controls, and fly around. My first few flights were just this… zipping around, feeling what it could do. I quickly learned what most pilots learn – it’s easy to fly your craft when it’s oriented directly away from you. Where left is left, right is right, forward is forward, etc. But once that vehicle turns and is coming toward you, all the controls are reversed. Thing flying toward you too fast? You pull back on the pitch stick (pull it toward you) to slow it down and pitch up. That’s not intuitive! I still have not worked this out – and in talking with other new pilots, I’m not alone here.
Eventually though it was time for the next step: First Person View, or FPV flying. In a nutshell, my drone has a small digital camera mounted on the front, and that is in turn wired to what amounts to a television transmitter. This signal can be sent back to a ‘groundstation’, or a set of goggles with a receiver and antenna. After some back and forth determining how to use goggles with my glasses (I ended up removing my glasses and wearing the goggles in a way that puts the screens a half inch further from my eyes than normal, which works), I was ready to fly.
This video is pretty much what happened. Did I fly? Yep. Was I able to be ‘on board’ and see what the drone sees? Sure enough. Was it the leaping, “Lo, I have slipped the bonds of earth” experience I was hoping for? Not even remotely. Next big lesson: Flying FPV is REALLY REALLY HARD. A drone doesn’t fly like an airplane – it doesn’t bank and swoop. In a wind, it behaves erratically and unintuitively. So naturally I crashed. A lot. Dozens of times. And each time, something would come off, something would break, things needed to be tuned… it was… exhausting.
That video was made around 6 weeks ago. Since then I’ve replaced all my motors, rebuilt the camera mount, installed a new camera and video transmitter; heck, I’ve remounted virtually every component on the frame.
The result? I’m… starting to enjoy it! Flight times are up, crashes are down, maneuverability is comfortable – we’re not yet ready to go tearing through concrete tunnels, but I can make loops around the field and mostly not crash into trees now. My drone is still tuned to a very basic level of responsiveness. I’m not doing crazy flips and the like – and frankly, ain’t gonna do that anytime soon. But… well, take a look at how I’m flying now. This was in the same field as the first video. Check it:
Am I super-pilot? Not even remotely. Am I starting to feel like this is fun, and that it lets me experience, in a weird way, what it means to fly? It comes close… and I’ll keep trying.
I’m a rabid user of Evernote and its associated screen capture tool, Skitch. I use it for just about everything, and regularly snap screenshots to share what I’m seeing with coworkers.
I’m aware that my screenshots are stored in my Evernote account, and there’s a disk space limit there. I’m okay with that; free services have to put limits on things. If I start running low on space, I go into my notebooks and start deleting things.
Yesterday, though, I was suddenly blocked from sharing screenshots: an alert from Evernote told me I was over my monthly upload limit, which would reset in 9 days. There is absolutely nothing I can do to fix this except wait, or pay money to release it.
This smacks of ransomware. My service has been interrupted unless I pay up, a service that up until now has been free. I have no way of ‘getting out of jail’ unless I cough up some dough, or wait over a week – and if I just wait, it may happen again next month. In addition, every time Evernote tries to sync now, I get a modal dialog box that says “Cannot sync [Learn More]”. You can’t dismiss this box; you must click on Learn More and get their little ad asking for money. Thanks, guys.
I’ve been considering paying for my Evernote Pro license, because I find the service quite valuable, but this… come on guys, this was a bad decision. You’re already limiting how much data I can store with the free version. Now you’re limiting how much I can upload, even though I have plenty of storage space? Ung.
It’s become sadly apparent that Google Plus, the service we had all hoped would dethrone Facebook and become a more open, usable, and at least mildly privacy-aware environment, is rotting on the vine. Features are being spun off into standalone products, and long-hoped-for features have never materialized.
So I’m falling back to the old standby. A year or two ago I completed rebooting Planet-geek and have been enjoying using it as my primary platform, so I’m going to take the final jump and make the blog my primary posting platform, while letting the fairly awesome SNAP tool from NextScripts repost / share things out to various social networks.
Right now I’m echoing posts to Facebook, Twitter, and Livejournal, but may add other sites going forward (NextScripts supports dozens of different systems). Any requests?
While the FriendsPlus.Me tool I was using was ‘okay’, I wasn’t happy with the several levels of redirects and the “you must source your post from G+” setup. This way, my blog is the authoritative source of my ramblings, just as I want it to be. And if you don’t want to subscribe to my RSS feed and are more comfortable on other platforms, that’s fine; you’ll still see my happy chatter.