I wanted to deactivate Office 2011 on my old Mac so I could activate it on my new Mac…so here’s a nice little rant about how well THAT went. Two more cool ways to show your Library folder on a connected drive or a remote Mac – and of course there’s a Clarify Tutorial over on Podfeet.com. I walk you through the experiments I’ve been doing trying to figure out how to effectively create the Live Show at podfeet.com/live using a new Mac with only one audio jack. In Chit Chat Across the Pond Bart walks us through how to use keywords in Aperture to find your photos.
Hi this is Allison Sheridan of the NosillaCast Mac Podcast, hosted at Podfeet.com, a technology geek podcast with an EVER so slight Macintosh bias. Today is Sunday November 17, 2013 and this is show number 445.
Why I don’t have a cool article for you about some disk speed tests I’ve been running:
- Needed a spreadsheet and graph – tried Numbers, wouldn’t do what I wanted, tried Google Docs, didn’t do it well
- New Mac, trying to move Office
- Deinstall from old Mac
- Web search
- Live chat started at 14:34 – told me to uninstall to deactivate it. Didn’t believe her. Told me to call Customer service. 27 minutes
- Customer Service – very hard to get to Mac
- Found Knowledge Base Article http://support.microsoft.com/kb/2398768. It said to delete about a gazillion plists step by step.
- Ran AppDelete while waiting
- Then poked around where they said in /private/var/db/receipts and found 1720 plist and bom files still in there, 51MB total. Found more garbage in Library/Application Support
- Finally got through, when Elly told me to uninstall I told her about the KB.
- FINALLY she said she’d deactivated the old computer
- Still wouldn’t take my activation key
- She said it was because I installed on the new Mac before uninstalling on the old Mac, so I should do the uninstall again on the new Mac and then reinstall on the new Mac
- After 1 hour 19 minutes – Still don’t have Office working
Update on Remote Library Folder
Last week I told you how proud I was of myself that I thought of suggesting to Eddie Smith on Twitter that he use the Terminal to find his Library folder on his backup drive. This week I found an even better way. Let me set the stage on why I needed it – turns out I needed yet another thing from my Application Support folder inside the Library folder from my previous Mac. I use Transmit as my FTP client, and it stores Favorites in the Application Support folder.
I figured this would be an opportunity to exercise my mad Terminal skills to copy the file from my remote machine to my new Mac. Turns out my skills are just not as mad as I thought yet and I crashed and burned. FINE. I had to backslide and use the Finder. Then I realized I’m right back where Eddie was, I can’t SEE the Library folder. I did some hunting online and found Flood4 on an Apple Discussion Board with the perfect answer. Here’s the trick. In the menu bar select Go, and pull down to Go to Folder. A little window will pop up where you can type a directory path. Instead of figuring out the syntax, you can be lazy. Open a second Finder window, navigate to your user directory on the remote drive/Mac, and drag that folder into the pop up window. It will auto-populate with the path to your user directory. Now here comes the trick…type /Library (capital L) at the end and hit Enter. You’re in!
Steve Davidson sent over another cool tip on this, one that will permanently fix the problem of not being able to see the user Library folder on a remote disk or Mac. For the remote Mac, ssh in as we learned last week (http://nosillacast.clarify-it.com/d/xj3aef) and then simply type in
chflags nohidden ~/Library/
From now on when you use the Finder GUI to look at that Mac, the Library will be visible to you. On the backup drive, you have to be a bit more specific to tell it where to look, but it’s still not too bad. I’ll read it off here, but basically you’re just saying chflags nohidden path to the Library:
chflags nohidden /Volumes/BackupDriveName/Users/YourUserName/Library/
I tested Steve’s directions on both my second Mac via ssh, and on my backup drive. They both worked great so now I can forget about it forever! Till the next time I need to remember it. And of course I created a Clarify Tutorial over on Podfeet.com on how to do it.
Only One Audio Jack on the new Retina MacBook Pro
I’ve been working really closely with the great people over at Rogue Amoeba (rogueamoeba.com) trying to figure out how to configure my new MacBook Pro for the live show. Let me set the stage for why I have a problem to be solved and how they can help.
For the live show I provide two different ways to participate – you can watch live video/audio as I create the show, or you can just listen to the audio (while in both cases playing in the chat room). I create the video portion using a tool called Wirecast Studio from Telestream.net, and I create the audio-only portion using a tool called Nicecast from rogueamoeba.com. I create these two different sources because people have different needs. If you’re blind, you actually can’t “see” the Flash play button on podfeet.com/live with a screenreader, so instead I pipe the audio-only stream over to the same page. If you’re on the go and you want to use the fine NosillaCast app created by Donald Burr, you need an audio stream for your iOS device. Again, Justin.tv being Flash-based, it doesn’t play nice on iOS.
Ok, so now we’ve established why I have two streams. The next piece of the puzzle is to get two sources of audio to go to both streams – my microphone so you can hear me, and GarageBand so you can hear what I’ve recorded when editing, and the listener contributions and Chit Chat Across the Pond when I do playback. If you only heard me live, trust me, it wouldn’t be that interesting! So I need to send a combined audio source to Nicecast and Wirecast Studio of both my mic and GarageBand. In the past, on my previous MacBook Pro I had two audio jacks – one for line out (headphones) and one for line in. This is going to get tricky now, so stay with me.
I have a hardware device called the iMic that plugs into USB and line in. I used a program called Audio Hijack Pro to hijack the audio from my mic and GarageBand and piped it straight to the iMic over USB. The iMic, in turn, sends it right back into my Mac as a Line In signal. That’s important because applications like Nicecast and Wirecast Studio can “see” a hardware device on Line In. It was a bit of a kludge of course, because I’m making analog audio noises with my mouth, the mic is converting it to digital and shoving it into the Mac via USB, then I’m telling the Mac to shove it back out over USB to the iMic, which sends it BACK into the Mac on Line In, which is analog, so the Mac has to convert it a THIRD time to digital. Whew, it’s amazing you could even hear me after all that A/D/A/D conversion!
As bad as all that sounds, it worked flawlessly for the last couple of years. Now I got the new MacBook Pro, and guess what? It’s only got ONE audio jack. That means if I plug in headphones to work on the show…I’ve got no line in jack any more for the iMic. I played around with the monitor jack on the iMic and couldn’t get it to let me listen in with headphones, so I was stumped.
I mentioned at the top of this that I started working with the Rogue Amoeba folks, specifically my new best friend Chris Barajas. What’s cool is that to Chris, everything I’m talking about here sounds like perfectly normal stuff to do! I’m sure most of you think I’m nuts by now, but Chris got it right off the bat.
So here’s where I’m 90% solved as of today. I’m going to replace the physical iMic with a piece of software that can do much of the same thing. It’s an oldie but a goodie – it’s called Soundflower from cycling74.com. Quoting the cycling74 website: “Soundflower is a Mac OS X (10.2 and later) system extension that allows applications to pass audio to other applications. Soundflower is easy to use, it simply presents itself as an audio device, allowing any audio application to send and receive audio with no other support needed. Soundflower is free, open-source, and runs on Mac Intel and PPC computers.”
Soundflower hasn’t been updated for Mavericks yet – the latest version is compatible with 10.8 Mountain Lion – but it actually does work for me under Mavericks. Soundflower is actually a kernel extension, which I normally don’t recommend because you’re mucking about in the underpinnings of your operating system, but Soundflower has been around for ages and has never given me a lick of trouble, so I feel comfortable with it. After installation you get a new Menu Bar app oddly enough called Soundflowerbed – I never understood why it has a separate name.
Now using Audio Hijack Pro I can hijack my mic and send it to what’s called Soundflower (2ch), hijack GarageBand and also send it to Soundflower (2ch), and then in Nicecast I can choose Soundflower (2ch) as the input audio and it works great! I should mention that all of this is on Mavericks so good onya Rogue Amoeba for getting everything working so quickly.
Unfortunately Wirecast Studio is another story. It isn’t yet Mavericks compatible but I figured it was worth a shot to try it anyway. Sometimes you get lucky and it’s some minor feature you don’t need that doesn’t work, so it’s often worth trying.
In Wirecast Studio you create what are called “shots”. I have a shot I call 3-up which is made up of three video sources. There’s a vertical panel on the right that shows the live chat from a virtual camera source running on a second Mac, and then on the left side I have two more video sources – video of me across the top and GarageBand running on my Mac below that. By the way, GarageBand is seen as a virtual camera source as well. These virtual camera sources are created by another Telestream application called Desktop Presenter.
That’s the video sources. On another layer I have an audio shot, which used to be Line In from the old Mac, but now it has to be set again to Soundflower (2ch). I started my experiments working with Wirecast Studio 4, the older version, and it actually worked! When I spoke into the microphone you could “hear” it in Wirecast Studio, and it could hear GarageBand. The problem though was the virtual video source for GarageBand would not render. It shows up fine in the preview but for some reason it just shows as a big giant red question mark on the real video.
Ok, let’s flip to the newer version 5 (again not certified for Mavericks yet). I almost got it to work. I can now get the audio to come through to Wirecast Studio, and the video sources work perfectly, but there’s some jittering on the video and audio that would drive the audience nuts. As I said, Telestream has not yet released a Mavericks version, but now I’m really optimistic that we can get this thing working as soon as they do have it ready.
Live Show and Google Plus
Let’s take a quick intermission to say hi to the folks in the live chat room. Every Sunday night at 5pm, even when I’ve been off playing around for the weekend, we join up over at podfeet.com/live where you can watch me make the show, laugh when I goof up, and chat with other NosillaCastaways about whatever you want. Sometimes they even talk about tech! Another fun way to hang out with your fellow geeks is over at our Google plus page at podfeet.com/googleplus. Steve posted a link there earlier this week on an iDevice Availability notifier, Steven Goetz posted in our “NosillaCastaways Show Off” area some cool photos he took this week, and George from Tulsa reported out on his methods to convert his MP3 library to lower quality versions to put on a portable device with small storage. Both the live show and the G+ community are thriving and everyone’s having fun so you could go check it out too!
As you know I’ve been enjoying my adventure of moving from one Mac to another, and finding all the little bits and pieces you forget about when you don’t use Migration Assistant. One of the things I decided to do by hand was to re-create all of the specific settings inside Feeder, the application I use to create the show notes and publish the podcast. I was kind of pleased with how I did this. Starting on the Mac with the correct configuration, I opened up Clarify and I took screenshots of every single one of the settings pages. Instead of saving that tutorial to my disk, I pushed the button to publish it to Evernote. I made sure Evernote synced to the cloud, then I went to the new Mac and opened up Evernote. There were all of the settings I needed to get Feeder set up perfectly on my new Mac. I was really pleased with how easily this worked, so I encourage you to remember to use Evernote to store your tutorials when you’re using Clarify. The story gets better though. This week Todd McCann had some problems with his podcast and he lost all of his settings in Feeder. He asked me if I had any documentation of the way I set it up. Luckily he’s a pretty sophisticated user so I didn’t have to write up advanced explanations of each of the settings; I was able to just send him the tutorial I made for myself and he was back in business.
I can’t imagine doing half of what I do on my Mac without Clarify. Head on over to clarify-it.com and download the free trial, so you can make yourself happy and your friends happy. Remember, don’t buy the Mac App Store version right now, because you won’t get the free upgrade when Clarify 2 comes out – be sure to go to clarify-it.com to buy.
Chit Chat Across the Pond
Important Security Updates:
- Apple release iOS 7.0.4, including an App Store security fix – http://support.apple.com/kb/HT6058
- Adobe release a Flash security update – http://www.adobe.com/support/security/bulletins/apsb13-26.html
- Last Tuesday was Patch Tuesday; it included 10 fixes in all versions of IE (including 2 zero-days), but did NOT patch the TIFF zero-day (more below) – http://technet.microsoft.com/en-us/security/bulletin/ms13-nov
- OpenSSH found a dangerous flaw in their Secure Shell (SSH) daemon on Nov 7 which MIGHT be exploitable, and released a fix the next day. If you run Linux keep an eye out for the update, and install it promptly (no, Apple have not patched it yet in OS X) – http://nakedsecurity.sophos.com/2013/11/09/openssh-fixes-potential-remote-code-execution-hole/
Important Security News:
- Sadly the bad-guys are trying to cash in on the Philippines Typhoon disaster – beware of emails asking for your help, that could well be scams. Do please give, but use donate buttons from reputable sources (e.g. iTunes Store & FireFox splash screen), or directly on the websites of trustworthy organisations like the International Red Cross – http://www.us-cert.gov/ncas/current-activity/2013/11/12/Philippines-Typhoon-Disaster-Email-Scams-Fake-Antivirus-and
- The Adobe story just keeps getting worse – they didn’t just lose a LOT of data, their encryption implementation was deeply flawed too – passwords were encrypted rather than hashed, the choice of encryption algorithm was unwise at best, and they didn’t encrypt the password hints at all. The result: every instance of the same plaintext password encrypts to the same value, so you can aggregate all the hints for the same password together, and easily crack thousands and thousands of people’s passwords without ever breaking the encryption. And, should the key ever be discovered, that one key will unlock ALL 130 million passwords, and, perhaps, if they used the same key, all the credit card numbers too – http://nakedsecurity.sophos.com/2013/11/04/anatomy-of-a-password-disaster-adobes-giant-sized-cryptographic-blunder/
- At least XKCD could make some quality nerd humour out of Adobe’s blunder – http://xkcd.com/1286/
- Microsoft warn Windows users about a zero-day TIFF exploit that is being used in the wild (and have released a ‘fix it’ tool to bluntly ‘fix’ the problem by disabling TIFF support) – http://nakedsecurity.sophos.com/2013/11/06/microsoft-warns-windows-users-of-zero-day-danger-from-booby-trapped-image-files/
- MacRumors got hacked – http://www.macrumors.com/2013/11/12/macrumors-forums-security-leak/
- Apple have released their quarterly report on government data requests – interestingly they’ve included a legal canary in the coal mine – they can never say that they have received a secret request from the FISA court, but they can tell us when they haven’t, which is what Apple have done in this report. If the line is gone from a future report, it would seem to be a sure bet that they did get a secret FISA request – http://nakedsecurity.sophos.com/2013/11/07/apple-publishes-new-transparency-report-is-there-a-warrant-canary-nesting-inside/ & http://tech.fortune.cnn.com/2013/11/07/apple-nsa-snoops-police/
- Google’s latest Transparency report shows world-wide data requests have doubled since 2009, but US requests have tripled – http://nakedsecurity.sophos.com/2013/11/15/google-us-data-requests-have-more-than-tripled-since-2009/
- Mozilla have released a FireFox plugin called Lightbeam that visually maps the relationships between the websites you visit, to help users better understand their online privacy – https://blog.mozilla.org/press-uk/2013/10/28/lightbeam-for-firefox-privacy-education-for-users-open-data-for-publishers/
- Microsoft make pro-active moves to boost the strength of crypto on Windows – http://nakedsecurity.sophos.com/2013/11/14/microsoft-leads-the-way-setting-new-cryptographic-defaults/
- FaceBook are pro-actively mining the leaked Adobe data, and forcing password resets on FaceBook users who re-used their Adobe email address & password on FB – http://krebsonsecurity.com/2013/11/facebook-warns-users-after-adobe-breach/
- The surprisingly human story of the first botnet – http://www.washingtonpost.com/blogs/the-switch/wp/2013/11/01/how-a-grad-student-trying-to-build-the-first-botnet-brought-the-internet-to-its-knees/
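To make the Adobe bullet above concrete, here’s a toy Python sketch – this is NOT Adobe’s actual scheme (the “encryption” below is just a keyed digest standing in for their unsalted cipher, and the passwords are made up) – showing why deterministic encryption leaks which users share a password, while per-user salted hashing doesn’t:

```python
import hashlib
import os

SECRET_KEY = b"one-key-for-everything"  # toy stand-in for Adobe's single encryption key

def deterministic_encrypt(password):
    # Toy stand-in for unsalted, deterministic encryption: the same input
    # always produces the same stored value, so equal passwords are visible
    # to anyone holding the database -- no key-cracking required.
    return hashlib.sha256(SECRET_KEY + password.encode()).hexdigest()

def salted_hash(password):
    # The proper approach: a fresh random salt per user, so two users with
    # the same password get completely different stored values.
    salt = os.urandom(16).hex()
    digest = hashlib.sha256((salt + password).encode()).hexdigest()
    return salt, digest

# Two users who happened to pick the same password:
a = deterministic_encrypt("monkey123")
b = deterministic_encrypt("monkey123")
print(a == b)   # True -- an attacker can group identical values and pool the hints

salt_a, hash_a = salted_hash("monkey123")
salt_b, hash_b = salted_hash("monkey123")
print(hash_a == hash_b)  # False -- salting hides which users share a password
```

(A real system would also use a deliberately slow algorithm like bcrypt or scrypt rather than plain SHA-256, but the salt is what defeats the hint-pooling attack Bart describes.)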
Memorial Day/Veterans Day Intermission
Given that this week both those of us in Europe and America remembered our forefathers who fought for the freedoms we enjoy, I thought this story from Naked Security was worth sharing: http://nakedsecurity.sophos.com/2013/11/15/in-memoriam-mavis-batey-mbe-codebreaker-extraordinaire-at-bletchley-park/
Main Topic – Meta Data & Aperture
Before I go on to describe how I use Aperture’s key wording system to help me keep order on my photos, I want to step back and look at a bigger question – how do you put order on data?
Whenever you’re asked to organise anything, it’s easy to assume that a simple hierarchy of files in folders in folders will do the trick. You start organising, and initially all goes well – then you meet a special case. Let’s take music as an example – you decide you want to manually manage your MP3 collection because you don’t trust iTunes. You start by creating a folder for your favourite artist, and in there you put a folder for each album, and then the relevant tracks, one by one. This goes well for the first 10 or 20 albums, and then the exceptions start to ruin your day.
Things begin to go wrong when you come across your first compilation – do you split the album and scatter the tracks across the folders for all the artists involved? Or do you make a new meta-folder called “compilations”, and dump the compilations in there? Either way, finding your content just got harder. If you go with the first compromise, it becomes near impossible to find the album, if you go with the second, it becomes very difficult to find all tracks by a given artist in your fancy file system.
Similarly, what do you do about duets?
However, things really go off the rails when you import your first classical music album. Instead of album, artist, and track, you now have album, piece, movement, conductor, orchestra, soloists, and composers. It is very realistic to want to be able to find all the movements that make up a symphony, to find all pieces by a given composer, and to find all performances by your favourite orchestra or soloist, as well as wanting to be able to find the albums of related pieces that you bought.
It’s impossible to create a folder structure to encapsulate all this information. If you want to do it in folders you have to choose one type of search to facilitate, and forget about the rest.
What you need to put order on your music is metadata – data about data, or in this example, data about your music files. You need the data itself in some kind of standard format, and you need an app that can read and interpret that meta data. In other words, you need ID3 tags, and you need a jukebox app.
Note that metadata is data IN CONTEXT. If I named my MP3s using some kind of naming convention I could appear to capture all the metadata I could possibly need, but I actually wouldn’t be, because the data would be lacking in context. I could try searching my filenames for all the songs I own by the band “Blue”, but that search would throw up all sorts of false positives, because it would match every file name containing the letters B, L, U & E in that order – songs with Blue in their title, and songs in albums called Blue, not just songs by the band Blue!
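To see the false-positive problem in action, here’s a tiny Python sketch – the filenames and track details are made up for illustration – contrasting a naive filename search with a search over structured, ID3-style metadata:

```python
# Hypothetical filenames -- only the first is actually by the band Blue.
filenames = [
    "Blue - One Love.mp3",
    "Joni Mitchell - Blue - A Case of You.mp3",   # an album called Blue
    "Eiffel 65 - Blue (Da Ba Dee).mp3",           # a song with Blue in the title
]

# A naive filename search matches all three:
matches = [f for f in filenames if "blue" in f.lower()]
print(len(matches))  # 3 -- two false positives

# The same data expressed as metadata (ID3-style fields), searched IN CONTEXT:
tracks = [
    {"artist": "Blue",          "album": "All Rise", "title": "One Love"},
    {"artist": "Joni Mitchell", "album": "Blue",     "title": "A Case of You"},
    {"artist": "Eiffel 65",     "album": "Europop",  "title": "Blue (Da Ba Dee)"},
]
by_blue = [t for t in tracks if t["artist"].lower() == "blue"]
print(len(by_blue))  # 1 -- only the track actually by the band Blue
```

The word “Blue” is the same string in both cases – the only difference is that the metadata version knows WHICH field the word lives in.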
The same principles apply to photographs. Encapsulating information about a photograph in the title of the photograph is infinitely better than not capturing the information at all. But, that information is lacking in context. If I search for “Flanders”, I will get both photographs of my beautiful native land, and my zany evangelical neighbour Ned!
The big question then is this, is there a photographic equivalent of ID3 tags? Yes! EXIF + IPTC data!
EXIF stands for “Exchangeable image file format” – http://en.wikipedia.org/wiki/Exchangeable_image_file_format
IPTC stands for “International Press Telecommunications Council” – http://en.wikipedia.org/wiki/IPTC_Information_Interchange_Model
IPTC and EXIF metadata can be embedded into most image formats used for storing photos, including the most important two, JPEG & TIFF. Image organising apps like iPhoto, Aperture, and LightRoom all understand EXIF and IPTC data, and when you export photos these apps will embed the metadata into the image (unless you ask them not to). This standard metadata can then be picked up by other apps or websites you share your images on. When I upload an image exported from Aperture to Flickr, it automatically pre-populates the Title, Description, GeoTag, and Keywords straight from the IPTC data in my JPEG. This saves time, and it makes the metadata available to search engines, which makes my images easier to find and bubbles them up to the top of relevant Google Image searches etc.
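As a conceptual sketch of what travels inside that exported JPEG, here’s a Python illustration – the camera model, captions, and coordinates are invented, and in real life a tool like exiftool or an imaging library would read these fields out of the file itself rather than from a hand-built dict:

```python
# Illustrative only: what EXIF (camera-recorded) vs IPTC (photographer-added)
# metadata might look like inside a JPEG exported from Aperture.
exported_jpeg_metadata = {
    "EXIF": {
        "Model": "Canon EOS 60D",        # recorded by the camera
        "ExposureTime": "1/250",
        "GPSLatitude": 51.05,            # the geotag
        "GPSLongitude": 3.72,
    },
    "IPTC": {
        "ObjectName": "Common Blue on a thistle",        # the title field
        "Caption-Abstract": "Taken in Flanders, 2013",   # the description field
        "Keywords": ["Flora & Fauna", "insect", "butterfly", "Common Blue"],
    },
}

def flickr_fields(meta):
    """Map the embedded metadata onto the fields a photo site pre-populates."""
    iptc = meta["IPTC"]
    return {
        "title": iptc["ObjectName"],
        "description": iptc["Caption-Abstract"],
        "tags": iptc["Keywords"],
        "geotag": (meta["EXIF"]["GPSLatitude"], meta["EXIF"]["GPSLongitude"]),
    }

print(flickr_fields(exported_jpeg_metadata)["title"])
```

The point is that the mapping is mechanical – because the fields are standardised, any app or website downstream can pick them up without you re-typing anything.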
Having proper metadata also means you can let Aperture do more work for you. One of the biggest differences between Aperture and iPhoto is the metadata support when searching your images, and when setting up smart albums – Aperture will let you search on any EXIF field, any IPTC field, PLUS a whole lot more Aperture-only metadata like project name and even adjustments applied.
Bottom line – for data to be at its most valuable, it needs to be contextualised. Hacking image titles to serve as keywords of a sort is definitely a lot better than nothing, but it’s the digital equivalent of hammering in a screw – it works, but there is a better way!
A Quick Summary of my approach to metadata in Aperture:
1) I geotag all my shots – I often strip the geotag out when publishing to the web, but within Aperture all is geotagged so all can be searched by place
2) I keyword all photos
3) I use Aperture’s nested keywords to group my keywords into related ‘taxonomies’ – one parent for ‘flora & fauna’ with sub-sections for flowers, trees, insects, animals, birds – one parent for ‘transportation’, one parent for ‘astrophotography’, one parent for process-related metadata (e.g. ‘tonemapped’, ‘light painting’) and so on. Whatever I care about, I create a nested category for, and within that category I apply the organisation that makes sense.
4) I use the HUD to search for tags, then drag-and-drop them onto many images at once – if you search for a nested tag, the HUD shows all parent tags, so if I just search for “Common Blue” I get Flora & Fauna + insect + butterfly + Common Blue (and also insect + damselfly + Common Blue), and I can then quickly drag and drop all the elements of that taxonomy onto all the photos I took of that insect.
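The nested-keyword lookup in step 4 can be sketched in Python – the taxonomy below reuses Bart’s own example categories, and the function mimics how the HUD surfaces every parent chain for a searched keyword (it is a sketch of the idea, not Aperture’s actual implementation):

```python
# Nested keywords, as described above: each keyword maps to its children.
taxonomy = {
    "Flora & Fauna": {
        "insect": {
            "butterfly": {"Common Blue": {}},
            "damselfly": {"Common Blue": {}},
        },
        "flowers": {},
        "trees": {},
    },
    "transportation": {},
    "astrophotography": {},
}

def find_chains(tree, keyword, path=()):
    """Return every parent chain ending in `keyword`, HUD-style."""
    chains = []
    for name, children in tree.items():
        here = path + (name,)
        if name.lower() == keyword.lower():
            chains.append(list(here))
        chains.extend(find_chains(children, keyword, here))
    return chains

for chain in find_chains(taxonomy, "Common Blue"):
    print(" > ".join(chain))
# Flora & Fauna > insect > butterfly > Common Blue
# Flora & Fauna > insect > damselfly > Common Blue
```

One search returns both taxonomies the keyword lives in, which is exactly why dragging a nested tag applies the whole chain of parents in one go.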
That’s going to wind this up for this week, many thanks to our sponsor for helping to pay the bills, Blue Mango Learning at bluemangolearning.com makers of ScreenSteps and Clarify. Don’t forget to send in your Dumb Questions, comments and suggestions by emailing me at [email protected], follow me on twitter and app.net @podfeet. Check out the NosillaCast Google Plus Community too – lots of fun over there! If you want to join in the fun of the live show, head on over to podfeet.com/live on Sunday nights at 5pm Pacific Time and join the friendly and enthusiastic NosillaCastaways. Thanks for listening, and stay subscribed.