Sony uses its Technology Day event to show off what it’s working on in its R&D labs, and this year we got some great visuals of the technology the company is developing. Amid PS5 haptics and 3D audio redesigns and a demo reel of Sony’s admittedly impressive screens for creating virtual film sets, we got to see a robotic hand that Sony said could determine grip strength based on what it was picking up, a slightly dystopian “global sensing system,” and more.
Perhaps the most interesting thing Sony presented was a headset with OLED displays offering 4K resolution per eye. While the headset Sony used in its presentation was very clearly intended for laboratory and prototype use, the specs Sony gave for the panels were reminiscent of rumors swirling around the PlayStation VR 2.
They don’t line up exactly, however; Sony said the featured headset is 8K, given its 4K per-eye displays, while the PS VR 2 will only be 4K overall, with 2000 x 2040 pixels per eye. Still, it’s exciting that Sony is working on panels built for virtual reality, as well as latency-reduction technology for them. In a question-and-answer session with reporters, Sony wouldn’t say when the screens would appear in an actual product, but said various divisions are already looking at how they could incorporate them into products.
Sony also introduced a robotic gripper, a mechanized hand that would let a machine pick up objects. While that isn’t anything new, Sony said its version can precisely control grip strength depending on what it’s holding, allowing it to grasp objects firmly enough that they won’t slip out (and to adjust if they start to slip) without crushing delicate items like a vegetable or a flower.
Sony says the gripper could be used for cooking or arranging items in a display case, but to achieve that level of functionality it would have to be paired with a way to move around and an AI that lets it determine which items to pick up. It’s hard to imagine we’ll be seeing this sort of thing in public anytime soon, but it makes for an impressive demo. It’s also nice that Sony made it look very robotic rather than like a human hand – I’ve had my fill of weirdly human robot parts for this month.
Sony’s presentation also featured some polished visuals to accompany other projects, as well as another look at some devices we’ve seen before. The company showed off machine-learning oversampling technology (similar to Nvidia’s DLSS), which it said could be used to improve the resolution and performance of ray-traced rendering. A GIF of Sony’s comparison shots would ruin their quality, but you can check them out in the video below.
Finally, Sony talked about its “Mimamori” system, which it says is designed to monitor the planet. As I watched the presentation in real time, I started to worry a bit when I saw this slide:
Sony then explained that its idea is to use satellites to collect data from sensors placed all over the Earth, designed to gather information on soil moisture, temperature, and more. Its pitch is that this could help scientists track the changing climate and help farmers adapt to those changes. Mimamori seemed closer to a pitch than a prototype, but it shows Sony is at least studying how it can leverage some of its technologies to help deal with climate change.
While some of the ideas Sony showcased seem like moonshots, it’s interesting to get a glimpse of what the company is doing in its labs, beyond creating PlayStation and Spider-Verse sequels. Even if we never end up with consumer devices out of it, at least we got some pretty cool futuristic visuals.