Snapdragon AR2 platform
At the latest Snapdragon Summit, Qualcomm announced its first purpose-built augmented reality platform: Snapdragon AR2 Gen 1. VR is immersive, but Qualcomm believes the future of spatial computing looks more like AR, because it blends the physical and the virtual in glasses you can wear all day, with the added benefit of presenting more information, and presenting it privately.
One of the biggest challenges here is packing all the essential features and functions into a small package that is not only comfortable and functional, but also doesn't look like a glass brick strapped to your face with velcro. Power and battery life, keeping the processor cool, a sufficient field of view, ease of use, display quality, comfort, weight, and the physical layout and wiring of components: fitting all of that into a small package is a tall order.
In the past, Qualcomm has released several XR platforms, which have served as the base platform and development kit for several successful VR headsets. AR, by contrast, will rely heavily on the AI technology introduced with the Snapdragon 8 Gen 2 launch for better edge computing. The AR2 is also optimized for distributed processing: spreading the load across multiple chips working simultaneously instead of a single monolithic chip.
This distributed approach is necessary for several reasons. A single monolithic chip occupies more contiguous space than several smaller ones and has much higher localized cooling requirements. Those localized hotspots are at odds with what AR needs for wider adoption: sleeker AR glasses.
The AR2 platform comprises three chips: an AR processor, a co-processor, and a connectivity chip. A lot of the heavy lifting still requires a tethered device such as a phone or laptop, so these are not standalone devices; that is part of the sacrifice made to create a sleeker AR experience.
The AR processor sits in one arm of the glasses and feeds important information back to the connected device: things like 6DoF motion tracking, spatial data (contextual highlighting and tracking of objects), and rendering tools.
The co-processor sits in the bridge of the glasses and handles sensor aggregation, AI, and computer vision. The connectivity chip uses the same FastConnect system as the Snapdragon 8 Gen 2 platform; it is placed in the other arm of the glasses along with the battery, and runs a low-latency Wi-Fi 7 connection to stream data to and from the host.
Interestingly, digging into the technical details, the AR processor is built on a 4nm process and shares much of its architectural design philosophy with Qualcomm's mobile processors. It has its own memory, Spectra ISP, CPU, Adreno display engine, sensing hub, and Hexagon and Adreno processors, all just rearranged and scaled differently. The new additions are the reprojection and image analysis engines.
The image analyzer works with 6DoF tracking to mark important points of interest as frames of reference, so that tracking remains accurate. It is implemented in hardware instead of software to speed up motion tracking and eliminate motion latency, which is the main cause of motion sickness with these setups.
The reprojection engine does the heavy lifting of mapping virtual objects onto the real world using that tracking information. It is responsible for keeping virtual objects stationary relative to the real world, so that virtual toys and displays stay anchored to tangible objects even as you move your head around.
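The core idea behind world-anchoring can be sketched in a few lines. This is a minimal illustration, not Qualcomm's implementation: each frame, the renderer transforms the object's fixed world-space position by the inverse of the latest head pose, so the object appears to stay put while the head moves. The function names and the simple 4x4 pose representation here are hypothetical.

```python
import numpy as np

def view_from_world(head_pose_world):
    """Invert the head pose (world <- head) to get the view transform (head <- world)."""
    return np.linalg.inv(head_pose_world)

def reproject(anchor_world, head_pose_world):
    """Position of a world-anchored point as seen from the current head pose."""
    p = np.append(anchor_world, 1.0)  # homogeneous coordinates
    return (view_from_world(head_pose_world) @ p)[:3]

# A virtual object anchored 2 m in front of the starting head position.
anchor = np.array([0.0, 0.0, -2.0])

# Head at the origin, looking down -Z.
head_t0 = np.eye(4)

# Head translated 0.5 m to the right; the anchor should now appear 0.5 m to the left.
head_t1 = np.eye(4)
head_t1[0, 3] = 0.5

print(reproject(anchor, head_t0))  # ≈ [ 0,  0, -2]
print(reproject(anchor, head_t1))  # ≈ [-0.5, 0, -2]
```

Doing this transform in dedicated hardware, right before scan-out, is what lets the platform correct for head motion without waiting for a full re-render.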
The co-processor is designed to offload some of the tracking-sensor and camera connections from the central processor, moving wiring and thermal footprint off what would otherwise be a bulky bridge or thick arms on glasses that don't fold flat. Another benefit is that it takes eye tracking and iris identification off the main CPU, reducing power consumption since the main CPU doesn't need to wake up for simple tasks.
The split between the AR processor (10mm x 12mm) and the co-processor (4.2mm x 6.2mm) reduces PCB area by 40% and the number of wires to be routed by 45% compared to a single monolithic chip and PCB, resulting in thinner, more stylish glasses.
Another major advantage of the AR2 platform is its power requirements. At less than 1 watt (excluding display power), the AR2 consumes 50% less power than the previous XR2 platform.
From a performance standpoint, the heavy AI focus results in hand and object tracking 2.5 times faster than on the XR2. Faster tracking means less processing time, and thus less energy used as well.
A large number of developers and OEMs have already signed up for AR2, including familiar names such as Lenovo, LG, OPPO, and others.
Snapdragon compute
Qualcomm is at the forefront of Arm-based laptops running Microsoft Windows and is the primary partner for Arm-based Microsoft Surface devices. Think laptops with 20 hours of runtime for long trips, or a whole month of standby time.
What sets these devices apart from previous Arm-based Windows efforts is the inclusion of an x86 compatibility mode in the hardware. It isn't especially efficient, but it at least gets most 32-bit x86 applications working, even if a bit slowly. As a stop-gap measure to support the platform's adoption, it has done its job well.
The next step is getting developers to rebuild their software against native Arm APIs for Windows, unlocking native performance for core applications. One of the key players in this regard is Adobe, which was quite slow to adopt Arm despite being a launch partner in 2019. It initially released native versions of Photoshop and Lightroom, though the native Photoshop still performs, to put it politely, less than optimally. At the Snapdragon Summit, Adobe announced its continued support and promised native Arm versions of more of its software going forward, introducing native versions of Fresco and Acrobat that will launch in 2023.
Qualcomm also wanted to talk about some of the changes coming to laptops, including on-device AI and why it matters. We've gone through a phase of GPU-based hardware acceleration that exploits the GPU's massively parallel nature, but in the past few years that focus has shifted toward AI with tensor cores.
A lot of the AI of the past few years has involved gaming and video enhancement on massive GPUs, with most of the work done up front through AI training on cloud computing in big data centers. Comparatively little AI has run on the device itself. Speech-to-text systems, for example, mostly use plain old algorithms locally; the only AI involved comes after the text is sent to a data center, where natural language engines try to infer what you said and meant.
Where on-device AI pays off is inference, the part that applies to what we do every day. One aspect of daily life many people have accepted, and even demand, is working from home, with its dreaded big meetings: messy backgrounds, random noise from people's microphones. Telling people to mute themselves when not speaking is a common piece of advice that few people follow.
We've seen NVIDIA Broadcast in action and what AI can do to eliminate background noise on a microphone. That same power blurs the background to put faces in the spotlight, and makes replacing the background without a green screen possible. We now have automatic framing and cropping, motion tracking that keeps you in frame and in focus, brightness and color correction for dark rooms, and now real-time multilingual translation with transcripts and subtitles, all thanks to AI. We could certainly do these things before AI, but they were computationally expensive; AI makes them much faster, to the point where we can do them all at the same time while using only a fraction of the energy. This is what Qualcomm wants to bring to the next generation of Snapdragon for Windows.
Part of this endeavor is the announcement of Qualcomm's next CPU project, called Oryon (pronounced Orion)… details on what this CPU will do are coming in 2023.
Snapdragon Sound and spatial audio
Another market Qualcomm is a part of is audio technology, mostly focused on Bluetooth and wireless audio. At the Snapdragon Summit 2022, Qualcomm wanted to showcase its advancements in spatial audio: 3D soundtracks and soundscapes.
This is an effect many other companies have experimented with over the past few years, mostly with VR headsets: as you move your head back and forth, sound is fed to the left and right speakers at different levels, giving the impression that it is coming from a fixed point in space. According to Qualcomm's research, 41% of respondents want this feature and are willing to pay more for it. It arrives as part of the Snapdragon 8 Gen 2 launch and the Snapdragon Sound S5 and S3 Gen 2 platforms.
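The basic trick can be sketched as follows. This is a minimal illustration with hypothetical names and sign conventions (positive yaw meaning a rightward head turn), not Qualcomm's actual renderer: the source stays fixed in the room, so only the angle relative to the listener's head changes as the head turns, and a constant-power pan law splits the signal between the left and right channels accordingly.

```python
import math

def head_tracked_gains(source_azimuth_deg, head_yaw_deg):
    """Constant-power stereo gains for a source fixed in the room.

    The source does not move; only its angle relative to the
    listener's head changes as the head turns.
    """
    rel = math.radians(source_azimuth_deg - head_yaw_deg)
    # Map the relative angle to a pan position in [-1, 1] (full left .. full right).
    pan = max(-1.0, min(1.0, math.sin(rel)))
    theta = (pan + 1) * math.pi / 4           # 0 .. pi/2
    return math.cos(theta), math.sin(theta)   # (left gain, right gain)

# Source straight ahead, head facing forward: equal levels in both ears.
print(head_tracked_gains(0, 0))   # ≈ (0.707, 0.707)
# Turn the head 90 degrees to the right: the source is now fully on the left.
print(head_tracked_gains(0, 90))  # ≈ (1.0, 0.0)
```

Real spatial audio uses head-related transfer functions rather than simple panning, but the principle is the same: the gains must be recomputed continuously from head-tracking data, which is why latency matters so much.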
This spatial tracking also requires low latency, and achieving that wirelessly, specifically over Bluetooth, is very challenging. Qualcomm has managed to reduce the latency on its audio platform to 48ms from last year's 89ms, which seems pretty high when you compare it to the <8ms people expect from displays (120 FPS+). Fortunately, our ears are generally less sensitive to delay than our eyes. Even so, high audio latency can be very distracting, especially in movies and games, so any reduction is greatly appreciated.
One item on the Snapdragon Sound platform likely to interest some folks is lossless audio support on new earbuds, over both the Bluetooth Classic and LE standards.
Additionally, new devices are expected soon that bring aptX audio to speakers and premium features such as lossless audio down to budget and mid-range headphones, so you probably won't have to spend a lot of money to get lossless audio.
Adaptive Noise Cancellation (ANC) is also being updated, and you may start to see support on more devices, along with better voice detection so you can still hold a conversation with ANC enabled if you want.
The platform also supports the new Auracast broadcast audio system, which can stream Bluetooth audio to multiple connected devices simultaneously.
With all that said, this wraps up the Snapdragon Summit and what we can expect to see in the new year. Be sure to also check out our Qualcomm Day One article to see what’s new in the mobile sector.
Support our efforts! With record-low ad revenue for writing sites, we rely more than ever on the support of our readers to help us continue producing this kind of content. You can support us by becoming a patron or by using the Amazon shopping affiliate links listed in our articles. Thank you for your support!