Apple’s decision to invest in iPhone photography was incredibly shrewd.

Smartphone cameras will deliver better quality images than you get from DSLRs within three years. “We expect that still images will exceed the image quality of single-lens reflex cameras within the next few years,” said Terushi Shimizu, President and CEO of Sony Semiconductor Solutions (SSS).

This statement comes as pro photographers (and video makers) make increasing use of iPhones for professional work - but also as machine vision intelligence reaches a tipping point to enable enterprise and industrial applications.

Throughout the history of the iPhone, Apple has focused on use of the device as a camera. It has also built out a wider ecosystem to support this use. Think about the Photos app, built-in document scanning, AI-driven person recognition, machine-driven text identification and translation, and, most recently, the capacity to identify images of flowers and animals using Visual Lookup. Apple’s efforts arguably accelerated with the iPhone 7 Plus, which included multiple lenses and zoom functionality for the first time in a smartphone.

Sony, which holds 42% of the global image sensor market for phones and has three of its highest-end IMX sensors inside the iPhone 13, believes sensors in high-end devices will double in size by 2024. This will enable “new imaging experiences,” Sony said. These will inevitably include zoom effects boosted by AI and Super HDR and should also extend to 8K video capture on smartphones. I think they will also extend to true 3D imaging capture sufficient to support truly immersive 3D experiences.

These predictions make it clear that Apple’s Cinematic Mode is a stalking horse from which to exploit the future evolutions of smartphone camera sensors.

But this kind of machine vision intelligence is just the consumer front end to far more complex operations that should translate into interesting enterprise opportunities. I’ve written before about Triton Sponge, which enables surgeons to more accurately track patient blood loss during surgery, and I think everyone now understands how camera intelligence combined with augmented reality can optimize performance across distribution, warehousing, and logistics chains. Industry is also embracing smartphone-quality imaging intelligence - factories make use of fault detection systems mounted on iPads and other devices to monitor production to maintain quality control, for example, and AR-based retail experiences continue to improve.

But what we know about today’s use cases must be considered alongside the consequences of those only now coming onstream, particularly around autonomy and augmented reality.

We think Apple is working towards eventual introduction of its take on AR glasses, which may be equipped with an array of as many as eight cameras - possibly due early next year and quite probably from Sony, which I believe has worked with Apple on and off on these projects for years. A highly interesting thread from Robert Scoble makes a host of predictions concerning these plans, but those cameras will likely be used to analyze and augment the reality you are in, as well as deliver virtual experiences you can safely explore.

A human wearing a set of AR glasses will rely on a similar set of technologies as a vehicle making use of machine vision intelligence to drive itself on the public highway.