Mimicking reality in a user interface isn’t exactly a new idea (skeuomorphism is kind of dated, after all). But a former Apple engineer has created a refreshing new take on a familiar system.
Bob Burrough, who worked under both Steve Jobs and Tim Cook at Apple, has come up with a new platform called Project Erasmus. The system itself is kind of like True Tone in that it shifts based on the environment. But that’s really where the similarities end.
Project Erasmus is a UI implementation that uses the lighting in your environment to illuminate, shade, or add reflections to interface elements on the device.
To show off the feature, Burrough created a simple proof-of-concept app shown below.
“It looks like the user-interface elements are physical objects that reside just beneath the surface of the screen, like you could reach in and touch them,” Burrough said in the video.
The app works thanks to a fish-eye-lens camera, which Burrough attached to the top of an iPhone. That camera allows the device to capture a wider-angle shot of its surroundings.
That environment data is then analyzed and mapped onto a digital scene within the app, which is then used to construct lighting and reflections that can impact UI elements on the device.
- For example, walking into a dark room causes the app to darken in step with its surroundings, giving it the appearance of an automatic ‘Dark Mode’.
- A brightly lit environment, on the other hand, brightens the app and casts shadows based on the direction the light is coming from, heightening the sense of immersion.
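Burrough hasn’t published the implementation, but the basic idea — sample the environment capture, estimate overall brightness and where the light is coming from, then shade the UI accordingly — can be sketched in a few lines. The function below is a hypothetical illustration (the name `estimate_lighting`, the 0.25 dark-mode threshold, and the top-brightness heuristic are all assumptions, not Burrough’s code):

```python
import numpy as np

def estimate_lighting(frame):
    """Estimate ambient lighting from a wide-angle environment capture.

    `frame` is an (H, W, 3) RGB array with values in [0, 1]. Returns the
    mean luminance, the dominant light direction in pixel space (relative
    to the image centre), and whether a dark theme should be used --
    enough to pick a light/dark appearance and orient drop shadows.
    """
    # Rec. 709 luma approximation per pixel.
    luma = frame @ np.array([0.2126, 0.7152, 0.0722])
    ambient = float(luma.mean())

    # Centroid of the brightest pixels approximates the light direction.
    ys, xs = np.nonzero(luma >= 0.9 * luma.max())
    h, w = luma.shape
    direction = (xs.mean() - w / 2, ys.mean() - h / 2)

    # Hypothetical threshold: treat dim rooms as "dark mode" territory.
    dark_mode = ambient < 0.25
    return ambient, direction, dark_mode

# A mostly dark capture with a bright patch in the top-right corner:
frame = np.zeros((10, 10, 3))
frame[0:2, 8:10] = 1.0
ambient, direction, dark_mode = estimate_lighting(frame)
# dark_mode is True; direction points up and to the right,
# so shadows would fall down and to the left.
```

A real implementation would run per frame on the fisheye feed and feed the direction into the renderer’s light source rather than returning raw numbers.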
The concept app is just one example of how the system could be used, and Burrough notes that each developer can tailor how an interface reacts to ambient lighting. The end result is a system that can provide an “unprecedented level of immersion.”
The former Apple engineer also published another video that explores how the feature could be used on an Apple Watch and in a standard web browser.
While the feature shown off in Burrough’s video is mostly aesthetic, there are some obvious practical applications — such as immersive lighting effects in augmented reality or in mobile games.
But, of course, it’s also just kind of cool. Not all software features need to be practical. Just look at the parallax effect in iOS 7. Combined with the potential practical uses in AR and mobile games, the feature is definitely one Apple should consider implementing in iOS in the near future.