And since we built the app, we rebuilt our status board, making it twice as good! (Literally.)
No, you’re not seeing double — this time we went with two goofy screens of stuff.
It’s pretty glorious.
About The Panels
Here are some implementation notes on our board:
Traditionally, Panic has been quiet about how-are-we-doing data; it always feels like a possible distraction for our hard-working team. But we’re always changing, and this revenue Graph panel has been fascinating. Every day, a script totals up our direct sales data, then retrieves our App Store sales data using AppFigures and their nice API. The totals get dumped into a database, and a simple PHP script makes them available to Status Board as JSON. That might sound tricky, but all told it took about a day of work to make happen.
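For the curious, the Graph panel simply eats JSON. Here’s a made-up sketch of the kind of payload that PHP script emits (the numbers are invented; see the Graph documentation for the full format):

```json
{
  "graph": {
    "title": "Revenue",
    "datasequences": [
      {
        "title": "Direct",
        "datapoints": [
          { "title": "Mon", "value": 4200 },
          { "title": "Tue", "value": 3850 }
        ]
      },
      {
        "title": "App Store",
        "datapoints": [
          { "title": "Mon", "value": 1100 },
          { "title": "Tue", "value": 1300 }
        ]
      }
    ]
  }
}
```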
Units have been especially interesting, since they reveal so much about the economics of (our) iOS software, as this Graph panel shows. Although our iOS apps sell a respectable number of units, the revenue they bring in barely charts compared to our Mac stalwarts. So far! We’re working hard on improving our iOS apps, and trying new ideas, to crack the iOS market a little bit more. (Sorry, this chart pre-dates Status Board, which is doing well!) By the way, Graph documentation is here.
The Support team works tirelessly to fight this tide! This is just an Email panel, which ties into our IMAP server. It took about 3 minutes to set up, and has been incredibly useful for seeing our support load at a very quick glance. (On the server, the Support team shares a single “Help” IMAP account, which has folders for each support person, and a script distributes the incoming support requests round-robin style.)
Conversely, this Graph panel is a great way to quickly see how many support responses are going out the door. (Of course, it’s not a competition — it’s just for fun.) To get accurate Sent counts, we have a script that looks at both outgoing Twitter replies and outgoing e-mails, and totals them up per-person into JSON.
This list is using our Table panel, connecting to an HTML file on our server. (Table documentation is here.) This is an edited version to protect our secret projects, of course. A project list is always tricky, since it’s the most manually-updated thing on the board, and always runs the risk of being stale. But, it’s fun to see who’s working on what.
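The feed itself is just a plain HTML file on the server. A placeholder sketch (the projects here are stand-ins, naturally; see the Table documentation for the supported markup):

```html
<!-- Placeholder rows; the real file is generated on our server. -->
<table>
  <tr><td>Cabel</td><td>Top-Secret Project</td></tr>
  <tr><td>Neven</td><td>Status Board updates</td></tr>
  <tr><td>Logan</td><td>TriMet panel</td></tr>
</table>
```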
What version of OS X are our users using? Using StatHat, which lets you track data incredibly quickly, I added one line of code to our PHP script that handles Sparkle updates. StatHat can output to Status Board natively via the Graph panel. Boom: instant OS version graph. (Also, fascinating how people use our Mac apps during the day, and not very much on the weekend.)
This is our car2go map, so we can quickly see if there are any cars near the office that we can hijack and drive home at the end of the day. It’s totally custom — we’re using the Do-It-Yourself panel so it’s just a little web page on our server. We signed up for the car2go API and combined their data with Google maps and some nice CSS animation. If enough people are interested, we might make this available to others. (Does your city have car2go?)
This is another Do-It-Yourself panel to show everyone’s bus lines. Sometimes end-of-the-day conversations are abruptly interrupted when we notice a bus is nearby. Logan has more recently made his own TriMet panel that we like a lot.
Of course, we’re also using the stock Weather, Twitter, and RSS panels for different things. And naturally, the Clock, to show the current time in Portland, Seattle, and San Francisco. You know, for conference call scheduling.
We installed a double-gang outlet in the wall to support two TVs and two iPad chargers. Permanent power.
We applied 3M Magnet Tape to the back of our iPads. They just stick right to the back of the display:
As people continue to build new things, our Status Board seems to change every week. Since taking these photos we’ve already added GoSquared, SNMP traffic graphs, and much more. That’s the best/worst thing about Status Board: it’s now so easy to make a cool Status Board that it’s hard to know when to stop. But hey, it’s fun!
If you’ve used Status Board to make a cool status board, send us a photo!
We were immediately taken by how awesome and important data becomes when it’s displayed in a beautiful layout in a prominent way. To glance up and see how the support inboxes were looking was priceless. It became the closest thing we had to a water cooler (second to our weird snack wall). It’s our focal point.
We thought: everybody deserves this kind of beautiful data.
So we got to work and made an iPad app. We built easy-to-use panels for anybody, to quickly see tweet counts, mailbox counts, news feeds, weather, and more. We built pro panels to display graphs and tables that can be fed with simple data feeds and be used to display virtually anything imaginable. We made it incredibly easy to set up. We made it look good sitting on your desk while you work. And (as an extra in-app purchase for professionals) we even added TV Out so you can throw this data up on your wall on a giant dedicated screen, for real.
Introducing Status Board. A brand new iPad app from Panic for displaying your data.
But this is just the beginning: our 1.0. Tell us what panels you’d like to see added. Tell us how you use it, or how you’d like to use it. Let us know if you find any bugs by e-mailing us. We’ll be listening and working like crazy, as always.
(Thanks to Greg, Neven, Dave, Kenichi, James, and the whole Panic crew for their excellent work on this.)
And stay tuned for a blog post about our new internal Panic Status Board on the office wall — which is now using, of course, this new app. It’s twice as amazing.
Hello! Panic, Inc., a software developer for Mac, iPhone, and iPad, is seeking an Office Manager at our 15-employee headquarters in Portland, OR. A rare, non-technical Panic job!
Candidate must already live in the Portland area, and be able to start immediately. Our office is located downtown, across from Powell’s Books. This is a full-time position.
Typical job duties include:
Being on-site weekdays from 9 AM to 6 PM to answer and screen phone calls, take messages, and receive deliveries and visitors
Answering general support emails, helping users get up and running, and forwarding technical questions when necessary
Handling voicemails and, somehow, faxes
Writing checks and paying bills immediately
Coordinating occasional social and corporate events, such as company dinners, talks, and conferences
Recording company meetings
Answering non-technical queries via Twitter
Responding to credit card disputes and refunds
Following up with purchase orders for payment (accounts receivable)
Providing price quotes for companies interested in volume purchases
Maintaining office calendar (who’s in/out, any upcoming special events, birthdays)
Scanning receipts and verifying purchase data
Keeping the dishwasher sane
Welcoming guests & making travel reservations
Unexpected Cabel Tasks and miscellaneous errands
Technical knowledge beyond email and word processing is a definite plus, as are a sense of humor and an easy-going attitude. We’d love someone who has a fondness for our products and technology in general. But being organized and reliable is critical.
In addition to base salary, Panic offers:
Medical, dental, and vision coverage after 90 days
Bi-annual profit-sharing bonuses
SEP IRA retirement plan contributions after first year
Flexible vacation policy
Reasonable, life-compatible hours
Convenient central Downtown location
Free TriMet passes and bike storage
A very beautiful and inspiring office, we think
Candidates of every race, gender, nationality, age, and orientation are encouraged to apply.
Sound good? E-mail your resume to us (UPDATE: This position has been filled. Thanks!) and tell us about yourself. If we’re interested, we’ll send you additional details and possibly schedule an interview. While we won’t be able to write back to everyone, we really thank you for your interest!
We’ve been doing significant testing lately with video out on various iOS devices for an upcoming project. In doing so, we waded right into the middle of a strange video-out mystery. It’s time to unravel that mystery. (Chung-chung!)
Mystery #1: 1600 × 900 Resolution, Tops
When we turn on “Video Mirroring” to send out an image through the Lightning AV Adapter, the system tells us that the maximum and optimum resolution we can do is 1600 × 900:
“Hang on, that’s not 1080p!”, you’re saying to yourself. That’s exactly what we said!
When we plug in the old Dock Connector AV Adapter, the system gives us the full 1920 × 1080:
So that’s a bummer. Curiously, Apple’s iPad mini tech specs claim “up to 1080p” video out support, but we can’t figure out how that’s possible. Maybe they mean that the adapter upscales the 1600 × 900 image to 1080p?
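(If you want to check your own device and adapter, something along these lines, using standard UIKit API, prints what the adapter reports:)

```objc
// Log the modes an external display (i.e., a connected adapter) claims to support.
for (UIScreen *screen in [UIScreen screens]) {
    if (screen == [UIScreen mainScreen]) continue;  // skip the built-in display
    for (UIScreenMode *mode in screen.availableModes) {
        NSLog(@"available: %.0f x %.0f", mode.size.width, mode.size.height);
    }
    UIScreenMode *preferred = screen.preferredMode;
    NSLog(@"preferred: %.0f x %.0f", preferred.size.width, preferred.size.height);
}
```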
Mystery #2: MPEG Artifacts
When you plug a device into a television, you expect a clean, crisp signal — a mirror of what you see on the screen. Right?
But not with the Lightning Digital AV Adapter:
Not exactly the cleanest text in the universe! Look at all that edge garbage. (We don’t get these artifacts with the old AV adapter.)
We thought we were going insane. This is just an AV adapter! Why are these things happening? Limited resolution. Lag. MPEG artifacts. Hang on, these are the same things we experience when we stream video from an iOS device to an Apple TV…
You got it. After some good Twitter leads, and a little digging, we had our theory:
Is the Lightning Digital AV Adapter basically a small AirPlay-like receiver?
I don’t mean AirPlay the network protocol, but rather AirPlay the video-compression system. The adapter must somehow set up a connection with the very iOS device it’s plugged into. It’s in no way passing raw HDMI out from the device; instead, it presents that stream upscaled to 1080p.
“But wait”, you might be saying. “You mean to tell me there’s enough electronics in that tiny plug to support AirPlay streaming and decoding?”
It seems unlikely, doesn’t it? So out came the hacksaw.
You would not believe how incredibly tiny those components are on the left. Smaller than anything we’ve seen, electronics-wise. What could all of those resistors be for?
Let’s flip it over:
Your eyes don’t deceive you — that tiny chip says ARM. And the H9TKNNN2GD part number on there points towards RAM — 2Gb worth.
In short: it appears the Lightning Digital AV Adapter has a SoC CPU.
So AirPlay (or AirPlay-like MPEG streaming) makes a lot more sense now. There’s a lot more going on in this adapter than we expected: we think the Lightning Digital AV Adapter outputs video by streaming AirPlay-style compressed MPEG, rather than passing raw HDMI. Are we off base? Let us know!
There are a lot of questions. What OS does it boot? @jmreid thinks the adapter copies over a “mini iOS” (!) from the device and boots it in a few seconds every time it’s connected, which would explain the fairly lengthy startup time for video out. Why do this crazy thing at all? All we can figure is that the small number of Lightning pins prevented them from doing raw HDMI, period, and the elegance of the adapter trumped the need for traditional video out, so someone had to think seriously outside the box. Or maybe they wanted to get as much functionality out of the iPad as possible, to reduce cost and complexity.
The bad news? Because the video is compressed and streamed internally, the quality is poor, and it’s not 1080p. We long for raw, untouched HDMI-out.
The good news? If someone complains that this insignificant plug costs $50, tell them it’s a tiny computer!
UPDATE 3/2: This anonymous comment — if you believe it — confirms nearly all of our theories and adds much-needed backstory. Very interesting! Thanks, whoever you are. Our nerd-brains appreciate it.
PS: If you’re wondering why we’re obsessed with clean iOS video out, we’ll post some status on that soon!
Before we can add new features to Coda 2 in the Mac App Store, we must first “Sandbox” it — adhere to a set of Apple guidelines aimed at increasing the security of Mac OS X.
What does this mean, really?
Well, for safety, sandboxing limits an app’s access to your local files until you give the app explicit permission to interact with those files. And once you’ve done this, your permission is remembered in the future. In other words, Coda won’t be able to see most of your local folders until you specifically select them in a traditional “Choose” dialog. The good news? Coda has Sites, and Sites have a Local Path, and once you “Choose” the Local Path when setting up your site, you’ll be able to view that folder and interact with it in the future. The bad news? You’ve got to reset all of your Local Paths, and if you don’t use Sites in Coda (which would be a bit weird) there will be brief bumps.
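(For developers wondering how the “remembered in the future” part works: sandboxed Mac apps typically persist these grants with security-scoped bookmarks. A rough sketch of the general mechanism; illustrative only, not Coda’s actual implementation:)

```objc
#import <Cocoa/Cocoa.h>

// Ask the user to pick a folder, then store a security-scoped bookmark
// so the access grant survives relaunches.
NSOpenPanel *panel = [NSOpenPanel openPanel];
panel.canChooseDirectories = YES;
panel.canChooseFiles = NO;
if ([panel runModal] == NSFileHandlingPanelOKButton) {
    NSData *bookmark = [panel.URL bookmarkDataWithOptions:NSURLBookmarkCreationWithSecurityScope
                           includingResourceValuesForKeys:nil
                                            relativeToURL:nil
                                                    error:NULL];
    [[NSUserDefaults standardUserDefaults] setObject:bookmark forKey:@"LocalRootBookmark"];
}

// On a later launch: resolve the bookmark and re-acquire access.
NSData *saved = [[NSUserDefaults standardUserDefaults] dataForKey:@"LocalRootBookmark"];
BOOL stale = NO;
NSURL *root = [NSURL URLByResolvingBookmarkData:saved
                                        options:NSURLBookmarkResolutionWithSecurityScope
                                  relativeToURL:nil
                            bookmarkDataIsStale:&stale
                                          error:NULL];
[root startAccessingSecurityScopedResource];  // pair with -stopAccessingSecurityScopedResource
```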
These changes should only affect the Mac App Store version. And we think most users won’t even notice that anything has changed.
Here’s the full list of what will change, slated for a future Coda release:
1 Local Root
Your site’s “Local Root” will have to be reset. You’ll be prompted to do this the first time you try to connect.
You only have to do this once for each of your sites!
2 Go To Folder
It will no longer be possible to “Go To” any local path by typing it in. “Go To Folder” on a Local path will now bring down a traditional “Choose” panel.
3 Path History
In the Sidebar and the Files browser, the “Path” pop-up can no longer show anything above your defined Local Root. To go above your Local Root, you’ll have to use Choose.
If you’re not working in a Site, you will land in a generic sandboxed home directory, and must Choose another folder to continue.
You only need to “Choose” a folder once!
4 Path Bar Browsers
If you click on a folder outside of your Local Root, you’ll have to select it manually via a Choose panel.
You only need to “Choose” a folder once!
5 Saving Files
It’s no longer possible to Save files you don’t have write access to, and Coda is no longer able to offer an authorization dialog to permit this behavior.
This includes any files you don’t own and don’t have proper permissions to write, such as files owned by a “web” process.
This is also an App Store restriction.
6 Get Info
It’s no longer possible to change permissions of files that require Administrator/Root access from Coda’s Get Info window.
You’ll have to switch to the Finder and adjust permissions there before editing these items.
This is also an App Store restriction.
7 Places
Any Local places will be cleared during the upgrade, and will need to be recreated, once.
Note: Places are defined per computer, so they will need to be reset on each computer Coda is used on.
8 SVN and Git
Tool paths may need to be reset depending on their location on your computer.
9 Local Shell
Coda will no longer be able to open a direct local shell/terminal. (You could always turn on Remote Login in Sharing preferences, and connect through that.)
That’s it. What do you think?
For the truly curious we’ve put together a special Coda 2 build with these changes.
If you wish to try Coda Sandboxing Test, it’s critical you understand this build is experimental and beta-quality. You must back up your system first.
Also, you must be currently using Coda 2.0.6 or higher. And if you’re using the Mac App Store + iCloud version of Coda 2, you must first turn off iCloud Sync in your current Coda, before launching this build.
We don’t have a timeline on this release, but we’re curious to know your general thoughts on Coda 2 and Sandboxing. Once again, we do not think these changes will affect most people, but we’d love it if you could please take this survey:
Thanks for reading, and thanks for using Coda 2. We’re excited to finish sandboxing and start work on more new, awesome things!
iTunes 11 is a radical departure from previous versions and nothing illustrates this more than the new album display mode. The headlining feature of this display is the new view style that visually matches the track listing to the album’s cover art. The result is an attractive display of textual information that seamlessly integrates with the album’s artwork.
After using iTunes for a day I wondered just how hard it would be to mimic this functionality — use a source image to create a themed image/text display.
The first step in replicating iTunes’ theming is obvious: get the background color used for the track listing. That seemed easy enough: just use simple color frequency to determine the most prevalent color along the left-hand side of the artwork. A simple color count gives pretty good results, but looking at iTunes it was clear there was more to it than that. I added a bit of logic to prefer colored backgrounds over plain black or white when those were the most prevalent colors; this produces more interesting styles, since seeing only black and white backgrounds would be boring. Of course, you don’t want to replace black or white if those colors really are dominant, so I made sure the fallback color was at least 30% as common as the default black or white.
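A minimal sketch of that first pass (my own illustration, not the demo project’s exact code): count the colors down the left edge with an NSCountedSet and take the most frequent one.

```objc
#import <Cocoa/Cocoa.h>

// Sketch: most frequent color along the artwork's left edge.
static NSColor *dominantEdgeColor(NSBitmapImageRep *rep)
{
    NSCountedSet *colors = [[NSCountedSet alloc] init];
    for (NSInteger y = 0; y < rep.pixelsHigh; y++) {
        [colors addObject:[rep colorAtX:0 y:y]];  // left-most pixel column only
    }

    NSColor *bestColor = nil;
    NSUInteger bestCount = 0;
    for (NSColor *color in colors) {
        NSUInteger count = [colors countForObject:color];
        if (count > bestCount) {
            bestCount = count;
            bestColor = color;
        }
    }
    // A real version would also apply the 30% rule described above before
    // letting a colored runner-up displace a genuinely dominant black or white.
    return bestColor;
}
```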
Once I started filtering black and white backgrounds, my results got a bit closer to iTunes’. After some more analysis, I saw that iTunes also looks for borders around the artwork: if the artwork has, say, a solid white border, iTunes removes it and bases its theming colors on the remaining interior content. I didn’t add this functionality, as it was outside the scope of my simple demo application.
After the background color was determined, the next step was to find contrasting text colors. Again, the first thing I tried was simple color counting. This provides surprisingly good results, but iTunes does better: relying only on color frequency gives you variants of the same color for the different types of text (e.g., primary, secondary, detail). So the next improvement was to make sure the text colors were distinct enough from each other to be considered separate colors.

At this point things were really starting to look good. But what else needs to be considered so the text always looks good on the chosen background? To keep the text colorful, I added a check that each text color has a minimum saturation level, which prevents washed-out or very light pastel colors from being used.

Now that the text had unique colors that looked good with the background, the only remaining problem was that a chosen text color could still lack enough contrast with the background to be readable. So the last thing I added was a check that any text color provides enough contrast with the background. Unfortunately, this requirement causes a rare “miss” when finding text colors, in which case the default black/white colors are used instead.
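In code, those last two checks might look something like this (again, my own sketch; the thresholds are invented, and iTunes’ actual values are unknown):

```objc
#import <Cocoa/Cocoa.h>

// Relative luminance of a color, for a WCAG-style contrast ratio.
static CGFloat relativeLuminance(NSColor *color)
{
    NSColor *rgb = [color colorUsingColorSpaceName:NSCalibratedRGBColorSpace];
    return 0.2126 * rgb.redComponent + 0.7152 * rgb.greenComponent + 0.0722 * rgb.blueComponent;
}

// Reject washed-out pastels that would look bad as text.
static BOOL isSaturatedEnough(NSColor *color)
{
    NSColor *rgb = [color colorUsingColorSpaceName:NSCalibratedRGBColorSpace];
    return rgb.saturationComponent >= 0.15;  // threshold is a guess
}

// Require a comfortable contrast ratio between text and background.
static BOOL hasEnoughContrast(NSColor *text, NSColor *background)
{
    CGFloat a = relativeLuminance(text);
    CGFloat b = relativeLuminance(background);
    CGFloat ratio = (MAX(a, b) + 0.05) / (MIN(a, b) + 0.05);
    return ratio >= 1.6;  // threshold is a guess
}
```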
The end result looks something like this:
It’s not 100% identical to iTunes — sometimes it’s better! Sometimes just different — but it works pretty well overall.
You can see exactly what I did in the following Xcode demo project:
A few notes about this demo. I did very basic frequency filtering to prevent random colors from appearing as text colors; in my case, I chose to ignore colors that appear only once. This threshold should be based on your input image size, since smaller images have fewer pixels to sample from. Another processing step iTunes performs, which I would also add if this were shipping code, is to look for compression fringing around the edges of the image. I’ve noticed a few cover art images that contain a single-pixel edge of white/gray fringe that should be removed before sampling for colors.
(Last but not least, this code was written in a few hours, and is very rough. So just in case you have thoughts about speed or optimizations, please note it was more of a thought exercise than a lesson in algorithm design. Engineer disclaimer complete.)
That being said, I hope this is somewhat interesting! It shows that with just a bit of work, you too can have fancy themed designs.
Let’s face it (sorry): face detection is cool. It was a big deal when iPhoto added Faces support — the ability to automatically tag your photos with the names of your friends and family adds a personal touch. And Photo Booth and iChat gained some awesome new effects in OS X Lion that can automatically track faces in the frame to add spinning birds and lovestruck hearts and so on. While not always productively useful, face detection is a fun technique.
I’ve seen attempts at duplicating Apple’s face detection technology. (Apple is far from the first company to do it.) There are libraries on GitHub and various blog posts for doing so. But recently I realized that Apple added support for face detection in OS X Lion and iOS 5. It seemed to slip under my radar of new shiny things. Developers now have a direct link to this powerful technology on both platforms right out of the proverbial box.
Using Face Detection through Core Image
Apple’s face detection is exposed through Core Image, the super-useful image manipulation library. Two classes are important: CIDetector and CIFeature (along with its subclass, CIFaceFeature). With a little experimenting one night, I was able to get a sample app detecting faces within a static image in about 10 lines of code:
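```objc
#import <CoreImage/CoreImage.h>

// A rough sketch (error handling omitted; the image path is a placeholder):
CIImage *image = [CIImage imageWithContentsOfURL:
                     [NSURL fileURLWithPath:@"/tmp/face.jpg"]];
NSDictionary *options = @{ CIDetectorAccuracy : CIDetectorAccuracyHigh };
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                          context:nil
                                          options:options];
NSArray *features = [detector featuresInImage:image];
NSLog(@"Found %lu face(s)", (unsigned long)[features count]);
```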
Note the dictionary of options. There’s one particularly useful key: CIDetectorAccuracy, which has two possible values, CIDetectorAccuracyLow and CIDetectorAccuracyHigh. The difference: high accuracy appears to perform additional processing on the image in order to detect faces, at the cost of higher CPU usage and slower performance.
In cases where you’re only applying detection to a single static image, high accuracy is best. Low accuracy becomes handy when manipulating many images at once, or when applying the detector to a live video stream. You see about a 2-4x improvement in render time with low accuracy, but face tracking might occasionally pick up a false positive in the background, or fail to detect a face angled away from the camera as well as high accuracy can.
Now that we have an array of faces, we can find out some information about each face within the image. CIFaceFeature exposes several useful properties to determine the bounding rectangle of the face, as well as the position of each eye and the mouth.
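For instance (a sketch; the has… flags report whether each feature was actually found, and Core Image coordinates put the origin at the image’s lower-left corner):

```objc
for (CIFaceFeature *face in features) {
    NSLog(@"face bounds: %@", NSStringFromCGRect(face.bounds));
    if (face.hasLeftEyePosition)
        NSLog(@"left eye: %@", NSStringFromCGPoint(face.leftEyePosition));
    if (face.hasRightEyePosition)
        NSLog(@"right eye: %@", NSStringFromCGPoint(face.rightEyePosition));
    if (face.hasMouthPosition)
        NSLog(@"mouth: %@", NSStringFromCGPoint(face.mouthPosition));
    // (iOS shown here; on the Mac, use NSStringFromRect/NSStringFromPoint.)
}
```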
Using these metrics, it’s then possible to draw on top of the image to mark each facial feature. What you get is a futuristic sci-fi face tracker à la The Fifth Element. Leeloo Dallas Multipass, anyone?
With a little more work, it’s pretty easy to apply this technique to live video from the device’s camera using AVFoundation: as frames come back from AVFoundation, you perform face detection and modify each frame before it’s displayed. But I’ll leave that as an exercise for the reader.
And amazingly, it even works with cats.
With a little more effort, I was able to grab the closest detected face’s region of the image, and do a simple copy-and-paste onto the other detected faces (adjusting for angle and distance, of course). Behold… Panic’s newest, most terrifying cloning technology!