
Design Inspiration: Second Screen Designs


Second screen designs have the potential to revolutionize many types of applications, and to date app developers have barely scratched the surface of that enormous potential. In this, the first article in a series on design that I'm calling Design Inspiration, I begin to explore current second screen designs and future design ideas that no one has built yet.

What is Second Screen?

Second screen refers to apps designed to extend or enhance a user experience (UX) by using a second device interactively connected to a primary device. To date, most second screen implementations have been very limited, focused mostly on providing second screen features for a television as the primary screen. Wikipedia's current definition of second screen, for example, is focused only on these types of implementations:

“The second screen refers to the use of a computing device (commonly a mobile device, such as a tablet or smartphone) to provide an enhanced viewing experience for content on another device, such as a television. In particular, the term commonly refers to the use of such devices to provide interactive features during “linear” content, such as a television program, served within a special app.”

But these implementations of second screen design are extremely limited, and I think there is a great deal more opportunity that no one is taking advantage of yet.

Current Second Screen Designs

Current second screen implementations are designed around the television.  They include things like:

  • Television program based apps.
  • Guide and TV program discovery apps.
  • Mirroring screens and remote casting.
  • Video game console based apps.

Television program based apps

Television program based apps run on smartphones, tablets or web sites and provide extended content for television shows and movies. The most innovative feature we see in this type of app is real time synchronization with the program, but even that does not add much value, and the user experiences are generally poor.


Guide and TV program discovery apps

These apps provide guides to what programs are available on broadcast television or via on demand services.  Most implementations don’t do anything more useful than provide an enhanced electronic version of a paper based TV guide.


But NetFlix has an interesting and useful implementation of second screen interactivity with their streaming service that begins to show the potential second screen designs can offer. The second screen mobile app can control streaming on a primary screen, effectively turning your mobile device into a sophisticated remote control for your TV.


I've used this a lot, with the NetFlix app on my iPad controlling the stream playing in the NetFlix app on my PlayStation 3. It's really handy. You can not only manage and play programs on your iPad, but also send the stream to play on your PlayStation and then control playback from the iPad. It's very cool. Sadly, the same functionality does not work if you use a Mac or PC connected to your TV through a web browser. NetFlix needs to implement this real time interactivity across all of the platforms they support.


Here’s a NetFlix doc on this second screen capability.  NetFlix does support an enormous number of devices – a huge competitive advantage for them.  So this feature may be supported on other primary screen based devices, but I’ve only used the iPad and PlayStation combination.  However, that  NetFlix doc on second screen seems to imply that only their app on PlayStation supports this right now.  For more on NetFlix device support check out the Watching NetFlix section of their Help Center.
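
Under the hood, this kind of pairing generally starts with the second screen app discovering a player on the local network. NetFlix and YouTube published the DIAL protocol for exactly this purpose, and its discovery step is a standard SSDP search. Here's a minimal sketch of that search in Swift; the search target string comes from the DIAL spec, but everything else (the transport details, the handling of replies) is a simplified illustration, not NetFlix's actual implementation.

```swift
import Foundation
import Network

// DIAL-style discovery: broadcast an SSDP M-SEARCH and listen for players
// on the local network that can be launched/controlled by a second screen.
let mSearch = [
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    "MAN: \"ssdp:discover\"",
    "MX: 2",
    "ST: urn:dial-multiscreen-org:service:dial:1",
    "", ""
].joined(separator: "\r\n")

let connection = NWConnection(host: "239.255.255.250", port: 1900, using: .udp)
connection.stateUpdateHandler = { state in
    guard case .ready = state else { return }
    connection.send(content: mSearch.data(using: .utf8),
                    completion: .contentProcessed { _ in })
    // A DIAL-capable device replies with an HTTP response whose LOCATION
    // header points at its description; the app then uses that service to
    // launch or drive playback on the primary screen.
    connection.receiveMessage { data, _, _, _ in
        if let data = data, let reply = String(data: data, encoding: .utf8) {
            print(reply)
        }
    }
}
connection.start(queue: .main)
```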


Mirroring screens and remote casting

Apple TV User Interface

This is simply mirroring a device’s screen on another device.  Apple’s AirPlay mirroring on Apple TV and Google’s Chromecast are examples.  Although not a terribly innovative second screen design, it is very useful.


Video game console based apps

Now here's where second screen designs start to get interesting. These implementations use a mobile device as a fully interactive second screen to a video game's primary screen.


Apple TV Dual Screen Gaming

The dual screen capability of Apple TV is an excellent example of this type of second screen design. In this scenario the game app runs on an iPhone or iPad, which displays a second screen UI with controls, info and other supporting functionality, while simultaneously broadcasting the primary game screen with the action to an Apple TV. This is wicked cool gaming and the future of how games should be. But I think Apple may do even more with the next generation Apple TV.
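
For app developers, the building block behind this on iOS is UIKit's external display support: when an AirPlay screen connects, the app places a second UIWindow on it while the touch UI stays on the handheld. A minimal sketch, assuming a hypothetical GameBoardViewController that renders the action:

```swift
import UIKit

// Stand-in for whatever view controller renders the game action on the TV.
final class GameBoardViewController: UIViewController {}

final class ExternalScreenManager {
    private var externalWindow: UIWindow?

    func startObserving() {
        // When an AirPlay / HDMI screen appears, put the game board on it.
        NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification,
            object: nil, queue: .main) { [weak self] note in
                guard let screen = note.object as? UIScreen else { return }
                self?.showGameBoard(on: screen)
        }
        // Tear the window down when the external screen goes away.
        NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification,
            object: nil, queue: .main) { [weak self] _ in
                self?.externalWindow = nil
        }
    }

    private func showGameBoard(on screen: UIScreen) {
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen                        // route this window to the TV
        window.rootViewController = GameBoardViewController()
        window.isHidden = false                       // show without stealing key status
        externalWindow = window
    }
}
```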


Sony is starting to do some interesting things in this area with their PlayStation 4 app, and so is Microsoft with Xbox SmartGlass. But I think Apple has the lead in second screen gaming because their tech has been out longer and is more robust. Not to mention that mobile device gaming has exploded and taken a big bite out of the console gaming market owned by the Xbox and PlayStation.


Second Screen Design Opportunities

Today's second screen implementations barely scratch the surface of second screen's potential, because so far they have limited themselves to the television stuck in your living room. The big opportunity is in second screen designs built entirely on mobile technologies.


Car based Second Screen Designs

Apple has been doing something interesting in the car since announcing the iOS in the Car initiative at their Worldwide Developers Conference (WWDC) in June 2013. Although not a second screen, Siri Eyes Free is just the beginning of what I think Apple is going to do, and could do, in the car.


Imagine your smartphone or tablet sending a second screen to an in car display that can control any app, and via that in car display, integrating with other in car systems like speakers, microphones, and control and monitoring systems. Combine that with Eyes Free functionality and NOW we're talkin'!


Imagine control over navigation apps, music, texting, video and more via a larger touch screen display mounted in your car, which could also be controlled through the voice recognition of Siri Eyes Free. The in car system could link wirelessly to your phone or tablet via Bluetooth, and the mobile device would also provide Internet access to the car's systems, eliminating the need for a separate and expensive cellular plan for your car.


Automatic Link Device

Now add integration with the car's monitoring and control systems, providing the type of connection and features that Automatic provides today. Automatic sells a device that plugs into your car's existing data port and connects your mobile device to the car's on board computer via Bluetooth; an app can then do all sorts of interesting and useful things. Now imagine having that connection provided by the car's built in second screen system.
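
To make the idea concrete, the data Automatic reads comes over the car's standard OBD-II port, and the decoding formulas are published in SAE J1979. Here's a minimal sketch of that decoding step in Swift; the Bluetooth transport and the surrounding app are assumptions, not Automatic's actual code.

```swift
import Foundation

// Decode a couple of standard OBD-II mode 01 responses.
enum OBD2 {
    /// "41 0C 1A F8" -> response to mode 01, PID 0C (engine RPM).
    static func engineRPM(fromResponse response: String) -> Double? {
        let bytes = response.split(separator: " ").compactMap { UInt8($0, radix: 16) }
        guard bytes.count >= 4, bytes[0] == 0x41, bytes[1] == 0x0C else { return nil }
        return (Double(bytes[2]) * 256 + Double(bytes[3])) / 4   // (256*A + B) / 4
    }

    /// "41 0D 3C" -> response to mode 01, PID 0D (vehicle speed in km/h).
    static func speedKmh(fromResponse response: String) -> Int? {
        let bytes = response.split(separator: " ").compactMap { UInt8($0, radix: 16) }
        guard bytes.count >= 3, bytes[0] == 0x41, bytes[1] == 0x0D else { return nil }
        return Int(bytes[2])                                      // A km/h
    }
}

// Example: a reply of "41 0C 1A F8" decodes to 1726 RPM.
print(OBD2.engineRPM(fromResponse: "41 0C 1A F8") ?? 0)
```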


The possibilities are endless….


Wearable Tech Designs

Wearable tech could take the form of wrist watches, glasses, wrist or arm bands, or any form factor that you wear full time. Wearable tech is barely in its infancy right now, and no one has released a wearable device that taps the enormous potential of the category.


Wearables are, almost by definition, second screen devices. Although they certainly can and will be capable of running their own apps, I think their enormous potential will be realized through second screen designs: designs that integrate the wearable with the power of the smartphone.


Samsung Gear

I think no one has been able to build a wearable that is really successful in the marketplace because success requires getting three things right:

  1. Make the hardware work well.
  2. Make it fashionable.
  3. Design software with a revolutionary second screen user experience.

This is very hard stuff. That's why Samsung has not had any real success with their smart watches, and why Google Glass has been in beta for ages and has inspired the term Glasshole.

Google Glass

Other entries like the Pebble are admirable attempts, but they will have a great challenge competing against the technical depth and deep pockets of the likes of Apple and Google.


Pebble Smartwatch

They are all just throwing stuff at the wall and hoping something sticks – and so far, nothing has. That's also why Apple has not announced anything; based on the many rumors, they are clearly working on perfecting these wearable designs. In typical Apple style, as they did with the iPod, iPhone, iPad and more, they won't be first, but will instead spend the time and take the care to invent something that no one else can.


Price points will also be important. The Samsung and Pebble devices currently go for between $150 and $299, so they are in the price range a device has to hit to go mainstream. But they need to be much more functional than they are now. Google Glass goes for a whopping $1,500, so it has absolutely no chance of being a mainstream device, especially with its very limited capabilities and bad reputation.


Make the hardware work well

This is about small size, long battery life and usable interfaces. No one has been able to crack this first nut. Judging by the many iWatch related rumors about curved glass, flexible glass, investments in sapphire glass, and new hires in sensor, biometrics and health technologies, Apple seems to be on the right track. Meanwhile, Samsung and Google are floundering.


Make it fashionable

Concept design of rumored Apple iWatch

Samsung and Google clearly don't get this at all. Samsung's Gear watches and Google Glass are the definition of geeky. Meanwhile Apple has made the right moves in this area, with new fashion related hires and comments from Tim Cook himself about the importance of fashion in wearables.


Design software with a revolutionary second screen user experience

So far the software for wearable tech is pretty simple. Really REALLY simple. No one has yet come up with a killer, “I gotta have it” app or set of features that we can't do without. So far it's just simple extensions of what our mobile devices already do, dumbed down to work on a device tiny and light enough to wear: notifications, music, a pitiful low res camera (Samsung), some health related features, a very low quality display (Pebble), and so on.


Wearable based Second Screen Design Ideas

Here are some design ideas that could make wearables a must have product:


Extend navigation apps

Extend navigation apps for use while driving, walking and biking. There's no need for mounts on your car or bike when the information you need most is on the screen on your wrist, while the app runs on your phone or tablet stowed away in your backpack, bike pack or pocket.
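
As a sketch of what the phone side of this could look like, here's a turn-by-turn update pushed to a watch-style second screen over WatchConnectivity. The framework is just one possible channel, and the message keys are made up for this example.

```swift
import WatchConnectivity

// Pushes the next turn instruction from the phone-based navigation app
// to a paired wearable acting as the second screen.
final class TurnByTurnPusher: NSObject, WCSessionDelegate {
    func activate() {
        guard WCSession.isSupported() else { return }
        let session = WCSession.default
        session.delegate = self
        session.activate()
    }

    func push(instruction: String, distanceMeters: Int) {
        guard WCSession.default.isReachable else { return }
        // "instruction" and "distanceMeters" are made-up keys for this sketch.
        WCSession.default.sendMessage(
            ["instruction": instruction, "distanceMeters": distanceMeters],
            replyHandler: nil,
            errorHandler: { print("send failed: \($0)") })
    }

    // Required delegate callbacks; nothing to do for this sketch.
    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {}
    func sessionDidBecomeInactive(_ session: WCSession) {}
    func sessionDidDeactivate(_ session: WCSession) {}
}
```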




Sophisticated health monitoring sensors

Go beyond movement tracking and heart rate, and integrate even more bio info into the device. Based on the many iWatch rumors, Apple has been making serious moves in this direction, with several new hires in sensor, biometrics and health technologies. Apple has already laid the foundation for this with the M7 motion coprocessor in the iPhone 5s.
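
To give a sense of what's already available to build on, here's a minimal sketch of reading the M7-backed motion data through Core Motion; the printouts are stand-ins for a real second screen UI.

```swift
import CoreMotion

// Core Motion exposes the activity and step data collected by the M7
// coprocessor; a wearable's second screen features could build on this feed.
let activityManager = CMMotionActivityManager()
let pedometer = CMPedometer()

if CMMotionActivityManager.isActivityAvailable() {
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity = activity else { return }
        if activity.walking { print("walking") }
        if activity.running { print("running") }
        if activity.automotive { print("driving") }
    }
}

if CMPedometer.isStepCountingAvailable() {
    pedometer.startUpdates(from: Date()) { data, error in
        if let steps = data?.numberOfSteps {
            print("steps so far: \(steps)")
        }
    }
}
```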


Health Central App

Apple has been rumored to be working on a Healthbook app for iOS 8. Healthbook would be similar to Passbook, focused on bringing health related data from multiple sources together into a single app. If extended to support the second screen features of wearables and integrate their health sensors, it could provide a compelling selling point for wearable devices.


Mobile Payment Features

Being able to make payments with the wearable on your wrist, without taking out your smartphone or wallet, could be a handy feature. Tech companies have been trying to make Near Field Communication (NFC) wireless payments useful, without much success.


Apple has resisted integrating NFC because, given its immaturity, it has not yet presented a valuable feature for customers. About a year ago Tim Cook said that mobile payments were “just getting started” and still “in its infancy”, and before that Phil Schiller said “It's not clear that NFC is the solution to any current problem”. But recently NFC has come up again in the Apple rumor mill, so perhaps it's about time to see NFC appear in the new iPhone and the rumored iWatch.


Wearable and Health App Ecosystem

There really are only two viable and successful app ecosystems in the mobile space: Apple's iOS and Android. The advantage of a well established and mature ecosystem is immeasurable. Besides the technology and infrastructure available, the vast number of experienced developers and apps that could be leveraged for wearables immediately is mind blowing.


Extending Screens and Seamless Sharing

Tablets, smartphones and computers could be used as extensions of each other, and as additional peripherals that just work, seamlessly. Some simple examples that exist right now are apps from Avatron and Splashtop that turn the iPad into a second display for your Mac or Windows computer, or mirror your iOS device to your Mac.

But there are lots of opportunities to make tablets, smartphones and computers work even better together.  Here are a few ideas:

iPad as Touch Screen Tablet

Use an iPad as a touch screen tablet device with a computer. The iPad displays what's on your computer screen, like a mirrored or extended display, and you can use your finger or a stylus to draw directly on the iPad screen as input into the app running on your computer. This is not just remote desktop or a mirrored or extended display, because it requires tighter integration than those types of apps provide.

Wacom and ViewSonic have been offering dedicated hardware that displays your computer's screen and lets you draw directly on it with a stylus, so you see your changes in real time on the screen you are drawing on. But these devices are expensive, some models very expensive.

Meanwhile, an iPad has all the hardware to do this quite well. All it needs is some software.
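
As a sketch of the iPad side of such an app, the code below captures each touch (or stylus) sample and packages it for streaming to a companion app on the computer. The StrokeSample format and the send hook are assumptions for illustration; the desktop integration is the hard part and isn't shown.

```swift
import UIKit

// A single input sample to stream to the computer; the wire format is made up.
struct StrokeSample: Codable {
    let x: Double
    let y: Double
    let force: Double
    let timestamp: TimeInterval
}

final class DrawingCaptureView: UIView {
    // Wire this up to whatever network transport carries samples to the computer.
    var send: ((Data) -> Void)?

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let point = touch.location(in: self)
        let sample = StrokeSample(x: Double(point.x),
                                  y: Double(point.y),
                                  force: Double(touch.force),
                                  timestamp: touch.timestamp)
        if let data = try? JSONEncoder().encode(sample) {
            send?(data)   // the computer-side app replays this into its canvas
        }
    }
}
```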

Smartphone and Tablet Camera as Computer Camera

Your smartphone's camera becomes a camera device integrated in real time into your computer. It acts like a directly connected webcam and still camera: it can capture real time video and still images directly into any app on your computer, just as if it were physically connected, and an app on your computer can remotely control all of the camera functions. All of this wirelessly, fast and in real time. No clouds, no syncing, no delays.
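
Here's a minimal sketch of the phone side of that idea: AVFoundation delivers raw camera frames, and a stand-in streamFrame function marks where compression and the wireless transport to the computer would go.

```swift
import AVFoundation

// Captures live camera frames on the phone so they can be streamed to a
// computer and presented there like a locally connected webcam.
final class RemoteCameraSource: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()
        session.startRunning()
    }

    // Called once per captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        streamFrame(sampleBuffer)
    }

    private func streamFrame(_ buffer: CMSampleBuffer) {
        // Hypothetical: encode the frame and send it to the computer.
        // Compression and the wireless transport are out of scope for this sketch.
    }
}
```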

All Smartphone/Tablet Devices as Computer Devices

Extend the camera idea to all the hardware on your smartphone or tablet that makes sense, so it acts like devices directly connected to your computer. And vice versa.

Storage

Smartphone and tablet storage should appear as drives you can mount on your computer, with native mobile OS support for mounting remote drives as easily as you can on a computer.

There are a lot of third party apps that provide similar functionality, but they are often awkward or not integrated into the mobile OS. The mobile OSes need more computer like file system functionality built into the OS, so devices can have central file systems, file managers and support for network file sharing protocols.

Audio

There might be value in allowing audio hardware to be seamlessly shared across devices: simple switching of wireless audio equipment, like Bluetooth headphones, between devices.

App Functionality Sharing

Place and receive phone calls on your computer. Seamlessly use your phone to handle the VoIP calls you can currently only take on your computer.

Pass a work in progress between devices without having to use a cloud service to get the file from one device to another. There's got to be a better way than saving a file to a cloud, using a file transfer tool or email, then manually retrieving the file on the other device and opening it. That's pretty crude considering how powerful our mobile devices are now.
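
One way a platform could support this is by letting an app advertise the user's current task so a nearby device can pick it up. Here's a minimal sketch using NSUserActivity; the activity type and userInfo keys are made-up examples, not an existing app's identifiers.

```swift
import Foundation

// Advertise the document the user is editing so another of their devices
// can offer to continue it, no cloud round trip or manual transfer required.
let activity = NSUserActivity(activityType: "com.example.editor.document")
activity.title = "Draft blog post"
activity.userInfo = ["documentID": "second-screen-designs",   // hypothetical keys
                     "cursorPosition": 1420]
activity.becomeCurrent()   // mark this as the user's current task

// The receiving device's app is handed the activity back and restores
// the same document at the same cursor position from the userInfo payload.
```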


Mobile devices should also be very sophisticated remote controls and extended user experiences for our smart TVs. The apps on all devices should be aware of each other and seamlessly integrate their functionality to create a dramatically enhanced user experience. Right now our “smart” TVs are still pretty dumb, and our mobile devices are totally clueless about the TV and its gigantic entertainment sources.



Conclusions

Takeaways from this first edition of Design Inspiration are:

  1. Think in terms of second screen design if your apps live in candidate areas like entertainment, car tech and wearables.
  2. Even very simple uses of second screen techniques, like notifications, are candidates, so almost every app developer should be thinking about how to leverage the second screen.
  3. Start designing wearable user experiences now. Figure out how your UX can benefit from wearables, for when they inevitably take off – once Apple perfects a wearable product.