
Panic Blog

From the desk of Cabel
Portland, Oregon 97205

The Panic Status Board: 2013 Edition

You might be familiar with where it all started: the status board we put on our Panic office wall in 2010.

Since then, as you may know, we turned that status board into an iPad app called Status Board. Now everyone can have a cool, beautiful, data-packed status board on their desk or wall.

And since we built the app, we rebuilt our status board, making it twice as good! (Literally.)

Panic Status Board

No, you’re not seeing double — this time we went with two goofy screens of stuff.

It’s pretty glorious.

About The Panels

Here are some implementation notes on our board:

Status Board - Revenue
Traditionally, Panic is quiet about how-are-we-doing data. It always feels like a possible distraction for our hard-working team. But we’re always changing, and this revenue Graph panel has been fascinating. Every day, a script totals up our direct sales data, then retrieves our App Store sales data using AppFigures and their nice API. The totals get dumped into a database, and then we make that available via a simple PHP script that outputs JSON to the Status Board. That might sound tricky, but all told it took about a day of work to make happen.
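For reference, the Graph panel consumes JSON shaped roughly like this (a minimal sketch based on the Graph documentation linked below, with made-up numbers):

    {
      "graph": {
        "title": "Revenue",
        "type": "line",
        "datasequences": [
          {
            "title": "Direct",
            "datapoints": [
              { "title": "Mon", "value": 1200 },
              { "title": "Tue", "value": 1450 }
            ]
          }
        ]
      }
    }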
Status Board - Units
Units have been especially interesting, since they reveal so much about the economics of (our) iOS software, as this Graph panel shows. Although our iOS apps sell a respectable number of units, the revenue they bring in barely charts compared to our Mac stalwarts. So far! We’re working hard on improving our iOS apps, and trying new ideas, in order to crack the iOS market a little bit more. (Sorry, this chart predates Status Board, which is doing well!) By the way, Graph documentation is here.
Status Board - Inbox
The Support team works tirelessly to fight this tide! This is just an Email panel, which ties into our IMAP server. It took about 3 minutes to set up, and it has been incredibly useful for seeing our support load at a very quick glance. (On the server, the Support team shares a single “Help” IMAP account, which has a folder for each support person, and a script distributes the incoming support requests round-robin style.)
Status Board - Sent
Conversely, this Graph panel is a great way to quickly see how many support responses are going out the door. (Of course, it’s not a competition — it’s just for fun.) To get accurate Sent counts, we have a script that looks at both outgoing Twitter replies and outgoing e-mails, and totals them up per-person into JSON.
Status Board - Projects
This list is using our Table panel, connecting to an HTML file on our server. (Table documentation is here.) This is an edited version to protect our secret projects, of course. A project list is always tricky, since it’s the most manually-updated thing on the board, and always runs the risk of being stale. But it’s fun to see who’s working on what.
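The Table panel, for what it’s worth, just points at an ordinary HTML file. Something as simple as this renders fine (a hypothetical example, not our actual project list):

    <table>
      <tr><td>Frank</td><td>Secret Project X</td></tr>
      <tr><td>Nicole</td><td>Status Board 1.1</td></tr>
    </table>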
Status Board - Sparkle
What version of OS X are our users using? Using StatHat, which lets you track data incredibly quickly, I added one line of code to our PHP script that handles Sparkle updates. StatHat can output to Status Board natively via the Graph panel. Boom: instant OS version graph. (Also, fascinating how people use our Mac apps during the day, and not very much on the weekend.)
Status Board - Car2Go
This is our car2go map, so we can quickly see if there are any cars near the office that we can hijack and drive home at the end of the day. It’s totally custom — we’re using the Do-It-Yourself panel, so it’s just a little web page on our server. We signed up for the car2go API and combined their data with Google Maps and some nice CSS animation. If enough people are interested, we might make this available to others. (Does your city have car2go?)
Status Board - TriMet
This is another Do-It-Yourself panel, to show everyone’s bus lines. Sometimes end-of-the-day conversations are abruptly interrupted when we notice a bus is nearby. Logan has more recently made his own TriMet panel that we like a lot.
Of course, we’re also using the stock Weather, Twitter, and RSS panels for different things. And naturally, the Clock, to show the current time in Portland, Seattle, and San Francisco. You know, for conference call scheduling.

Hardware Notes

  • This time, we chose the Samsung DE55A 55″ Professional Display. Bright, thin bezel, built to stay on.
  • To cover up the Samsung logo, we used a piece of black non-glare artist tape. (Electrical tape was too shiny.)
  • We installed a double gang outlet in the wall, to support two TVs and two iPad chargers. Permanent power.
  • We applied 3M Magnet Tape to the back of our iPads. They just stick right to the back of the display:


As people continue to build new things, our Status Board seems to change every week. Since taking these photos we’ve already added GoSquared, SNMP traffic graphs, and much more. That’s the best/worst thing about Status Board: it’s now so easy to make a cool Status Board that it’s hard to know when to stop. But hey, it’s fun!

If you’ve used Status Board to make a cool status board, send us a photo!

Posted at 12:04 pm 45 Comments

From the desk of Cabel
Portland, Oregon 97205

Status Board Mania!

It’s only been about a day since we unleashed our Status Board app to the world, and we’ve been truly astonished by the number of cool things people have built to make it even more useful and amazing.

Here are some of the greatest things we’ve seen so far.


Dead Simple Greatness. One click for new things:

  • TubeTracker — an incredible one-click layout for people in the UK who rely on the tube (pictured above)
  • AAPL — simple panel for Apple’s stock price (don’t follow too closely or you’ll go crazy)
  • LastFM — see your last-listened track
  • App Store Review Times — a great way to see how busy Apple is
  • WWDC Alert — but really, how fast are tickets going to sell out this year?
  • BART Arrival Times — for those of you in San Francisco

New Native Sources. Direct-from-the-source data for your Status Board.

  • LeafPing — output your Envato sales data to Status Board. An example.
  • uri.lv — track your podcast statistics on the big screen.
  • AppViz — this must-have app for App Store sales tracking can now output to Status Board
  • Don’t forget our amazing launch sources: StatHat (so useful!) and Hockey.

Sources/Conduits. Some code experience necessary to get these going:

  • Nest — a quick look at the temperatures on your Nest
  • Server Statistics — keep an eye on your server loads
  • OmniFocus — a Python conduit to get your tasks up and running
  • Google Analytics — 7-day website stats
  • Jenkins — display running Jenkins jobs in a table
  • TimeTiger — interestingly, a Windows app for time tracking
  • Mint Analytics — a Pepper to create Status Board-compatible web stats
  • Mite — time tracking reports
  • Things — a way to get your Things to-do lists up and running
  • AppFigures 1 — a conduit for displaying your AppStore sales data
  • AppFigures 2 — another simple PHP conduit for AppStore sales data
  • BitBucket Issues — track open issues in Git/Mercurial hosted source

(And you can always add a new Do-It-Yourself panel and point it to always-running Mario.)

We’ve heard of some fantastic web services working on native Status Board data, including AppFigures and GoSquared. Stay tuned!

Finally, some unofficial third-party sites are springing up to track new things: Pinboard, StatusBoardWidgets.com, and StatusBoardApp.info.

And we love seeing photos of people’s Status Board installations, such as this one in a Ducati dealership:


Keep sending us your cool things! Tweet to @panic or send us an e-mail!

Posted at 1:34 pm 20 Comments

Introducing Status Board. Beautiful data for all.

Quite some time ago, we made a cool Status Board for the Panic office.

We were immediately taken by how awesome and important data becomes when it’s displayed in a beautiful layout in a prominent way. To glance up and see how the support inboxes were looking was priceless. It became the closest thing we had to a water cooler (second to our weird snack wall). It’s our focal point.

We thought: everybody deserves this kind of beautiful data.

So we got to work and made an iPad app. We built easy-to-use panels for anybody, to quickly see tweet counts, mailbox counts, news feeds, weather, and more. We built pro panels to display graphs and tables that can be fed with simple data feeds and be used to display virtually anything imaginable. We made it incredibly easy to set up. We made it look good sitting on your desk while you work. And (as an extra in-app purchase for professionals) we even added TV Out so you can throw this data up on your wall on a giant dedicated screen, for real.

Introducing Status Board. A brand new iPad app from Panic for displaying your data.


But this is just the beginning: our 1.0. Tell us what panels you’d like to see added. Tell us how you use it, or how you’d like to use it. Let us know if you find any bugs by e-mailing us. We’ll be listening and working like crazy as always.

(Thanks to Greg, Neven, Dave, Kenichi, James, and the whole Panic crew for their excellent work on this.)

And stay tuned for a blog post about our new internal Panic Status Board on the office wall — which is now using, of course, this new app. It’s twice as amazing.

PS: Remember how we got obsessed with the poor video quality of the Lightning AV Adapter? Now you know why!


Posted at 1:46 pm 44 Comments

Coda 2.0.8 Beta 1, Cabel

April 5th, 2013

The wheels continue to turn: here’s a test of Coda 2.0.8.

We’re working on bigger additions to Coda, but in the meantime we’ve been fixing lots of little things. This update should improve stability and speed, and adds Transmit 4.3 favorites importing.

If you’re interested, get Coda 2.0.8b1 here (52 MB).

If you find anything weird, let us know ASAP via Hive!

Posted at 3:23 pm 16 Comments

Wanted: Office Manager & Non-Technical Support

Hello! Panic, Inc., a software developer for Mac, iPhone, and iPad, is seeking an Office Manager at our 15-employee headquarters in Portland, OR. A rare, non-technical Panic job!

Candidate must already live in the Portland area, and be able to start immediately. Our office is located downtown, across from Powell’s Books. This is a full-time position.

Typical job duties include:

  • Being on-site weekdays from 9 AM – 6 PM to answer / screen phone calls, take messages, and receive deliveries and visitors
  • Answering general support emails, helping users get up and running, and forwarding technical questions when necessary
  • Handling voicemails and, somehow, faxes
  • Writing checks and paying bills immediately
  • Coordinating occasional social and corporate events such as company dinners, talks, conferences
  • Recording company meetings
  • Answering non-technical queries via Twitter
  • Responding to credit card disputes and refunds
  • Following up with purchase orders for payment (accounts receivable)
  • Providing price quotes for companies interested in volume purchases
  • Maintaining office calendar (who’s in/out, any upcoming special events, birthdays)
  • Scanning receipts and verifying purchase data
  • Keeping the dishwasher sane
  • Welcoming guests & making travel reservations
  • Unexpected Cabel Tasks and miscellaneous errands

Technical knowledge beyond email and word processing is a definite plus, as are a sense of humor and an easy-going attitude. We’d love someone who has a fondness for our products and technology in general. But being organized and reliable is critical.

In addition to base salary, Panic offers:

  • Medical, dental, and vision coverage after 90 days
  • Bi-annual profit-sharing bonuses
  • SEP IRA retirement plan contributions after first year
  • Flexible vacation policy
  • Reasonable, life-compatible hours
  • Convenient central Downtown location
  • Free TriMet passes and bike storage
  • A very beautiful and inspiring office, we think

Candidates of every race, gender, nationality, age, and orientation are encouraged to apply.

Sound good? E-mail your resume to us (UPDATE: This position has been filled. Thanks!) and tell us about yourself. If we’re interested, we will send you additional details and possibly schedule an interview. While we won’t be able to write back to everyone, we really thank you for your interest!

Posted at 11:37 am 5 Comments

From the desk of
Cabel
Engineering Dept.

The Lightning Digital AV Adapter Surprise

We’ve been doing significant testing lately with video out using various iOS devices for an upcoming project. In doing so, we waded right into the middle of a strange video-out mystery. It’s time to unravel that mystery. (Chung-chung!)

Mystery #1: 1600 × 900 Resolution, Tops

When we turn on “Video Mirroring” to send out an image through the Lightning AV Adapter, the system tells us that the maximum and optimum resolution we can do is 1600 × 900:

Lightning

“Hang on, that’s not 1080p!”, you’re saying to yourself. That’s exactly what we said!

When we plug in the old Dock Connector AV Adapter, the system gives us the full 1920 × 1080:

Dock Connector

So that’s a bummer. Curiously, Apple’s iPad mini tech specs claim “up to 1080p” video out support, but we can’t figure out how that’s possible. Maybe they mean that the adapter upscales the 1600 × 900 image to 1080p?
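(If you want to poke at this on your own device, the modes a screen claims to support are easy to enumerate. A quick sketch, assuming an adapter is attached:)

    #import <UIKit/UIKit.h>

    // Log every mode each attached screen claims to support.
    for (UIScreen *screen in [UIScreen screens]) {
        for (UIScreenMode *mode in screen.availableModes) {
            NSLog(@"%@ mode: %@%@",
                  screen == [UIScreen mainScreen] ? @"internal" : @"external",
                  NSStringFromCGSize(mode.size),
                  mode == screen.preferredMode ? @" (preferred)" : @"");
        }
    }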

Mystery #2: MPEG Artifacts

When you plug a device into a television, you expect a clean, crisp signal — a mirror of what you see on the screen. Right?

But not with the Lightning Digital AV Adapter:


Not exactly the cleanest text in the universe! Look at all that edge garbage. (We don’t get these artifacts with the old AV adapter.)

Theory

We thought we were going insane. This is just an AV adapter! Why are these things happening! Limited resolution. Lag. MPEG artifacts. Hang on, these are the same things we experience when we stream video from an iOS device to an Apple TV…

You got it. After some good Twitter leads, and a little digging, we had our theory:

Is the Lightning Digital AV Adapter basically a small AirPlay-like receiver?

I don’t mean AirPlay the network protocol, but rather AirPlay the video compression system. The adapter must somehow set up a connection with the very iOS device it’s plugged into. It’s in no way passing raw HDMI out from the device; it’s receiving a compressed video stream and presenting it upscaled to 1080p.

“But wait”, you might be saying. “You mean to tell me there’s enough electronics in that tiny plug to support AirPlay streaming and decoding?”

It seems unlikely, doesn’t it? So out came the hacksaw.


You would not believe how incredibly tiny those components are on the left. Smaller than anything we’ve seen, electronics-wise. What could all of those resistors be for?

Let’s flip it over:


Your eyes don’t deceive you — that tiny chip says ARM. And the H9TKNNN2GD part number on there points towards RAM — 2 Gb (256 MB) worth.

In short: it appears the Lightning Digital AV Adapter has a SoC CPU. 

So, AirPlay (or AirPlay-like MPEG streaming) makes a lot more sense now.

Conclusion

There’s a lot more going on in this adapter than we expected: indeed, we think the Lightning Digital AV Adapter outputs video by using AirPlay (or similar MPEG streaming). Are we off base? Let us know!

There are a lot of questions. What OS does it boot? @jmreid thinks the adapter copies over a “mini iOS” (!) from the device and boots it in a few seconds every time it’s connected, which would explain the fairly lengthy startup time for video out. Why do this crazy thing at all? All we can figure is that the small number of Lightning pins prevented them from doing raw HDMI, period, and the elegance of the adapter trumped the need for traditional video out, so someone had to think seriously outside the box. Or maybe they want to get as much functionality out of the iPad as possible to reduce cost and complexity.

The bad news? Since the video is compressed and streamed internally, the quality is poor, and it’s not 1080p. We long for raw, untouched HDMI-out.

The good news? If someone complains that this insignificant plug costs $50, tell them it’s a tiny computer!

UPDATE 3/2: This anonymous comment — if you believe it — confirms nearly all of our theories and adds much-needed backstory. Very interesting! Thanks, whoever you are. Our nerd-brains appreciate it.

PS: If you’re wondering why we’re obsessed with clean iOS video out, we’ll post some status on that soon!
Posted at 3:57 pm 188 Comments

From the desk of Cabel
Portland, Oregon 97205

Coda and Sandboxing

Before we can add new features to Coda 2 in the Mac App Store, we must first “Sandbox” it — adhere to a set of Apple guidelines aimed at increasing the security of Mac OS X.

What does this mean, really?

Well, for safety, sandboxing limits an app’s access to your local files until you give the app explicit permission to interact with those files. And once you’ve done this, your permission is remembered in the future. In other words, Coda won’t be able to see most of your local folders until you specifically select them in a traditional “Choose” dialog. The good news? Coda has Sites, and Sites have a Local Path, and once you “Choose” the Local Path when setting up your site, you’ll be able to view that folder and interact with it in the future. The bad news? You’ve got to reset all of your Local Paths, and if you don’t use Sites in Coda (which would be a bit weird) there will be brief bumps.
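(Under the hood, the standard way a sandboxed Mac app remembers that permission is a security-scoped bookmark. Here’s a rough sketch of the mechanism — not necessarily Coda’s exact code:)

    // chosenURL comes from the "Choose" (NSOpenPanel) dialog.
    // Save a security-scoped bookmark so access survives relaunch.
    NSError *error = nil;
    NSData *bookmark = [chosenURL bookmarkDataWithOptions:NSURLBookmarkCreationWithSecurityScope
                           includingResourceValuesForKeys:nil
                                            relativeToURL:nil
                                                    error:&error];

    // On a later launch, resolve the bookmark and regain access.
    BOOL stale = NO;
    NSURL *url = [NSURL URLByResolvingBookmarkData:bookmark
                                           options:NSURLBookmarkResolutionWithSecurityScope
                                     relativeToURL:nil
                               bookmarkDataIsStale:&stale
                                             error:&error];
    [url startAccessingSecurityScopedResource];
    // ... work with files inside the chosen folder ...
    [url stopAccessingSecurityScopedResource];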

These changes should only affect the Mac App Store version. And we think most users won’t even notice that anything has changed.

Here’s the full list of what will change, slated for a future Coda release:

1 Local Root

Your site’s “Local Root” will have to be reset. You’ll be prompted to do this the first time you try to connect.

You only have to do this once for each of your sites!

2 Go To Folder

It will no longer be possible to “Go To” any local path by typing it in. “Go To Folder” on a Local path will now bring down a traditional “Choose” panel.

3 Path History

In the Sidebar and the Files browser, the “Path” pop-up can no longer show anything above your defined Local Root. To go above your Local Root, you’ll have to use Choose.

If you’re not working in a Site, you will land in a generic sandboxed home directory, and must Choose another folder to continue.

You only need to “Choose” a folder once!

4 Path Bar Browsers

If you click on a folder outside of your Local Root, you’ll have to manually choose the folder via the Choose panel.

You only need to “Choose” a folder once!

5 Saving Files

It’s no longer possible to Save files you don’t have write access to, and Coda is no longer able to offer an authorization dialog to permit this behavior.

This includes any files you don’t own and don’t have proper permissions to write, such as files owned by a “web” process.

This is also an App Store restriction.

6 Get Info

It’s no longer possible to change permissions of files that require Administrator/Root access from Coda’s Get Info window.

You’ll have to switch to the Finder and adjust permissions there before editing these items.

This is also an App Store restriction.

7 Places

Any Local places will be cleared during the upgrade, and will need to be recreated, once.

Note: Places are defined per computer, so they will need to be reset on each computer Coda is used on.

8 SVN and GIT

Tool paths may need to be reset depending on their location on your computer.

9 Local Shell

Coda will no longer be able to open a direct local shell/terminal. (You could always turn on Remote Login in Sharing preferences, and connect through that.)

That’s it. What do you think?

For the truly curious, we’ve put together a special Coda 2 build with these changes.

Experimental

If you wish to try Coda Sandboxing Test, it’s critical you understand this build is experimental and beta-quality. You must back up your system first.

Also, you must be currently using Coda 2.0.6 or higher. And if you’re using the Mac App Store + iCloud version of Coda 2, you must first turn off iCloud Sync in your current Coda, before launching this build.

Got that? Download the build here. (50 MB .zip)

We don’t have a timeline on this release, but we’re curious to know your general thoughts on Coda 2 and Sandboxing. Once again, we do not think these changes will affect most people, but we’d love it if you could please take this survey.

Thanks for reading, and thanks for using Coda 2. We’re excited to finish sandboxing and start work on more new, awesome things!

Posted at 1:11 pm 8 Comments

From the desk of
Wade
Engineering Dept.

iTunes 11 and Colors

iTunes 11 is a radical departure from previous versions and nothing illustrates this more than the new album display mode. The headlining feature of this display is the new view style that visually matches the track listing to the album’s cover art. The result is an attractive display of textual information that seamlessly integrates with the album’s artwork.

After using iTunes for a day I wondered just how hard it would be to mimic this functionality — use a source image to create a themed image/text display.

The first step in replicating the iTunes theming is obvious: get the background color used for the track listing. This seemed easy enough: just use simple color frequency to determine the most prevalent color along the left-hand side of the artwork. Doing a simple color count gives pretty good results, but looking at iTunes it was clear there was more to it than that. I proceeded to add a bit of logic to prefer colored backgrounds instead of just using black and white when those were the most prevalent colors. This produces more interesting styles, since seeing only black and white backgrounds would be a bit boring. Of course, you don’t want to replace black or white if those colors really are dominant, so I made sure that the fallback color was at least 30% as common as the default black or white.
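Here’s a rough sketch of that first pass — illustrative code, not the demo project’s exact implementation. It counts colors down the artwork’s left edge, and only falls back to black or white when no colored candidate is common enough:

    // Sample the left edge of the artwork and pick a background color,
    // preferring a "real" color over near-black/near-white when it's
    // common enough. (Assumes an RGB bitmap; illustrative only.)
    static NSColor *pickBackgroundColor(NSBitmapImageRep *rep)
    {
        NSCountedSet *colors = [[NSCountedSet alloc] init];
        for (NSInteger y = 0; y < rep.pixelsHigh; y++)
            [colors addObject:[rep colorAtX:0 y:y]];

        NSColor *topColor = nil, *topBlackOrWhite = nil;
        NSUInteger topCount = 0, topBWCount = 0;

        for (NSColor *color in colors) {
            NSUInteger count = [colors countForObject:color];
            BOOL nearBlackOrWhite = (color.brightnessComponent < 0.09 ||
                                     color.brightnessComponent > 0.91);
            if (nearBlackOrWhite) {
                if (count > topBWCount) { topBWCount = count; topBlackOrWhite = color; }
            } else if (count > topCount) {
                topCount = count; topColor = color;
            }
        }

        // Use a colored background only if it's at least 30% as common
        // as the dominant black/white (the threshold mentioned above).
        if (topColor && topCount >= 0.3 * topBWCount)
            return topColor;
        return topBlackOrWhite ?: [NSColor whiteColor];
    }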

Once I started filtering black and white backgrounds, my results got a bit closer to iTunes’. After doing some more analysis, I saw that iTunes also looks for borders around the artwork. So let’s say you have a solid white border around the artwork picture: iTunes will remove the border and base its theming colors on the remaining interior content. I didn’t add this functionality, as it was outside the scope of my simple demo application.

After the background color was determined, the next step was to find contrasting text colors. Again, the first thing I tried was simple color counting. This provides surprisingly good results, but iTunes does better. If we relied only on color frequency, we’d get variants of the same color for the different types of text (e.g. primary, secondary, detail). So the next thing I did to improve the results was to make sure the text colors were distinct enough from each other to be considered separate colors.

At this point things were really starting to look good. But what other aspects would need to be considered to ensure the text always looked good on the chosen background color? To ensure colorful text, I added a bit of code to enforce a minimum saturation level for text colors, which prevents washed-out or very light pastel colors from being used. Now that the text had unique colors that looked good with the background, the only remaining problem was that the resulting text colors could still lack enough contrast with the background to be readable. So the last thing I added was a check that any text color provides enough contrast with the background. Unfortunately, this requirement does cause a rare “miss” when finding text colors, which then causes the default black/white colors to be used.
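The contrast check itself can be quite simple. Here’s the flavor of it — again illustrative, using a standard luminance-ratio test:

    // Compare perceived luminance (Rec. 601 weights) and require a
    // minimum ratio between text and background to call the pair readable.
    static BOOL colorsContrast(NSColor *text, NSColor *background)
    {
        CGFloat lumA = 0.299 * text.redComponent + 0.587 * text.greenComponent + 0.114 * text.blueComponent;
        CGFloat lumB = 0.299 * background.redComponent + 0.587 * background.greenComponent + 0.114 * background.blueComponent;
        CGFloat contrast = (MAX(lumA, lumB) + 0.05) / (MIN(lumA, lumB) + 0.05);
        return contrast > 1.6; // threshold chosen by eye
    }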

The end result looks something like this:

It’s not 100% identical to iTunes — sometimes it’s better! Sometimes just different — but it works pretty well overall.

You can see exactly what I did in the following Xcode demo project:


A few notes about this demo: I did very basic frequency filtering to prevent random colors from appearing as text colors; in my case, I chose to ignore colors that appear only once. This threshold should be based on your input image size, since smaller images won’t have as many pixels to sample from. Another processing technique that iTunes performs, and that I would also add if this were shipping code, is looking for compression fringing around the edges of the image. I’ve noticed a few cover art images that contain a single-pixel edge of white/gray fringe that should be ignored and removed before sampling for colors.

(Last but not least, this code was written in a few hours, and is very rough. So just in case you have thoughts about speed or optimizations, please note it was more of a thought exercise than a lesson in algorithm design. Engineer disclaimer complete.)

That being said, I hope this is somewhat interesting! It shows that with just a bit of work, you can have fancy themed designs too.

UPDATE: Thanks to Aaron Brethorst, this code is also now on GitHub.

Posted at 10:55 am 76 Comments

Coda 2.0.7 Beta 1, Cabel

December 4th, 2012

It’s minor, but we thought our deepest Coda fans could give Coda 2.0.7 a whirl.

If you’re interested, grab Coda 2.0.7b1 here (51MB).

UPDATE 12/10: The beta has ended. The app has been released for direct customers and submitted to Apple.

Notable changes: improved stability and syntax highlighting performance.

If you find issues, let us know ASAP via Hive!

PS: We also recently solicited, via Twitter, testers for Transmit w/iCloud and Dropbox Favorites Sync (coming soon!), and a new Panic iPad app that’s all about Status. You should follow us!

Posted at 3:17 pm 6 Comments

From the desk of
logan
Engineering Dept.

Fun with Face Detection

Let’s face it (sorry): face detection is cool. It was a big deal when iPhoto added Faces support — the ability to automatically tag your photos with the names of your friends and family adds a personal touch. And Photo Booth and iChat gained some awesome new effects in OS X Lion that can automatically track faces in the frame to add spinning birds and lovestruck hearts and so on. While not always practically useful, face detection is a fun technique.

I’ve seen attempts at duplicating Apple’s face detection technology. (Apple is far from the first company to do it.) There are libraries on GitHub and various blog posts for doing so. But recently I realized that Apple added support for face detection in OS X Lion and iOS 5. It seems to have slipped under my radar of new shiny things. Developers now have a direct link to this powerful technology on both platforms, right out of the proverbial box.

Using Face Detection through Core Image

Apple’s face detection is exposed through Core Image, the super-useful image manipulation library. Two classes are important: CIDetector and CIFeature (along with its subclass, CIFaceFeature). With a little experimenting one night, I was able to get a sample app detecting faces within a static image in about 10 lines of code:

    // Create the image
    CIImage *image = [CIImage imageWithContentsOfURL:[NSURL fileURLWithPath:@"Photo.jpg"]];

    // Create the face detector, asking for high accuracy
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:CIDetectorAccuracyHigh, CIDetectorAccuracy, nil];

    CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:options];

    // Detect the faces
    NSArray *faces = [faceDetector featuresInImage:image];

    NSLog(@"%@", faces);

Note the dictionary of options. There’s one particularly useful key: CIDetectorAccuracy, which has two possible values, CIDetectorAccuracyLow and CIDetectorAccuracyHigh. The only difference: with high accuracy, additional processing appears to be performed on the image in order to detect faces, at the cost of higher CPU usage and lower performance.

In cases where you’re only applying detection to a single static image, high accuracy is best. Low accuracy becomes handy when manipulating many images at once, or when applying the detector to a live video stream. You see about a 2-4x improvement in render time with low accuracy, but face tracking might pick up a couple of false positives in the background once in a while, or be unable to detect a face angled away from the camera as well as high accuracy could.

Now that we have an array of faces, we can find out some information about each face within the image. CIFaceFeature exposes several useful properties to determine the bounding rectangle of the face, as well as the position of each eye and the mouth.

Using these metrics, it’s then possible to draw on top of the image to mark each facial feature. What you get is a futuristic sci-fi face tracker à la The Fifth Element. Leeloo Dallas Multipass, anyone?

    // Create an NSImage representation of the image
    NSImage *drawImage = [[NSImage alloc] initWithSize:NSMakeSize([image extent].size.width, [image extent].size.height)];
    [drawImage addRepresentation:[NSCIImageRep imageRepWithCIImage:image]];

    [drawImage lockFocus];

    // Iterate the detected faces
    for (CIFaceFeature *face in faces) {
        // Get the bounding rectangle of the face
        CGRect bounds = face.bounds;

        [[NSColor colorWithCalibratedWhite:1.0 alpha:1.0] set];
        [NSBezierPath strokeRect:NSRectFromCGRect(bounds)];

        // Get the position of each facial feature
        if (face.hasLeftEyePosition) {
            CGPoint leftEyePosition = face.leftEyePosition;

            [[NSColor colorWithCalibratedWhite:1.0 alpha:1.0] set];
            [NSBezierPath strokeRect:NSMakeRect(leftEyePosition.x - 10.0, leftEyePosition.y - 10.0, 20.0, 20.0)];
        }

        if (face.hasRightEyePosition) {
            CGPoint rightEyePosition = face.rightEyePosition;

            [[NSColor colorWithCalibratedWhite:1.0 alpha:1.0] set];
            [NSBezierPath strokeRect:NSMakeRect(rightEyePosition.x - 10.0, rightEyePosition.y - 10.0, 20.0, 20.0)];
        }

        if (face.hasMouthPosition) {
            CGPoint mouthPosition = face.mouthPosition;

            [[NSColor colorWithCalibratedWhite:1.0 alpha:1.0] set];
            [NSBezierPath strokeRect:NSMakeRect(mouthPosition.x - 10.0, mouthPosition.y - 10.0, 20.0, 20.0)];
        }
    }

    [drawImage unlockFocus];

With a little more work, it’s pretty easy to apply this technique to live video from the device’s camera using AVFoundation. As you get back frames from AVFoundation, you perform face detection and modify the frame before it is displayed. But I’ll leave that as an activity for the reader. :-)

And amazingly, it even works with cats.

With a little more effort, I was able to grab the closest detected face’s region of the image, and do a simple copy-and-paste onto the other detected faces (adjusting for angle and distance, of course). Behold… Panic’s newest, most terrifying cloning technology!

Here’s a little sample app. Have fun!

Posted at 11:25 am 18 Comments