“We have to meet all of their expectations” Sony talks diverse markets, AI and global shutters

Masanori Kishi, head of the Lens Technology and Systems business unit, and Masaaki Oshima, head of Sony’s Imaging Entertainment business.

Photo: Richard Butler

“Compared to the past five years, the range of creators has become more diverse,” says Masaaki Oshima, head of Sony’s Imaging Entertainment business: “so we have to meet all of their expectations, from the still shooter to the movie shooter, from the older user to the young, male and female.”

We spoke to Oshima and Masanori Kishi, the head of the Lens Technology and Systems business unit, at the CP+ show earlier this year. They had some interesting things to say about the changing nature of the market, the role they expect AI to play and the importance of global shutter sensors.

Oshima seems to think there’s still reason to be hopeful about the state of the market. “We’d thought the total market volume would shrink, but in fact it’s been very stable,” he says: “and we assume the market volume will continue to be stable in 2024. We believe the total market for mirrorless and full-frame will increase.”

“We think the individual customer relationship will be the key”

But companies will need to work harder to satisfy the many different types of users, he says: “We think the individual customer relationship will be the key to helping them enjoy our imaging experience. Customizing the marketing to the individuals will be the key, I think, not only the cameras’ functions.”

He also highlights some of the work Sony has been doing to increase accessibility, through features that audibly describe the on-screen options and its retinal projection kit. “We’re not only targeting existing creators but also the people that want to be creators,” he says.

Kishi says he believes Sony’s commitment to a single lens mount will help the company satisfy a broad audience. “We have a diverse range of bodies but all using one mount,” he says. “This is a very good concept, I think, so we can reach various kinds of customers.”

This means a lot of lens development work, he explains: “From the lens point-of-view we need to support all these various kinds of customer. We get a lot of requests from customers. That’s very exciting for us.”

Authenticity and AI

In talking about what comes next for the industry, Oshima raised two apparently conflicting trends.

The first is authenticity, he says: “imaging authenticity and concerns about AI. We believe there are two ways to address this: to protect the creators’ rights and use the AI powers.”

This leaves Sony in the same position as Adobe: pushing to develop AI features while also working as part of the Content Authenticity Initiative to prove the absence of its use.

“Both of them are very important for us to implement in our technology,” he says.

In dedicated cameras this is unlikely to mean generative AI techniques, he suggests: “we will focus on how to help our creators shoot as they want: how to implement creators’ insights into the camera functions.”

And this is likely to mean both features to help with shooting and with optimizing the results: “we’ll provide more post-production functions but also more real-time functions.”

These will go beyond what we’ve already seen, he says: “We still have room to accelerate our AF functions using AI autofocus, not only in the camera body but also lens functions have room to improve by utilizing AI power.”

The next question, he says, is where the computation is conducted: “The machine-learning itself is [currently] calculated in the cloud and implemented in the body while remaining on the cloud side. We are not sure whether it’s better to have that AI occurring in the [camera] body.”

“We have to utilize AI power, whether in the body or cloud-side”

For instance, he says: “More complicated lens correction can be realized on the cloud side. In the camera body, complex processes are difficult because of the machine power, but the cloud can realize more complex compensation.”

“Anyway we have to utilize AI power, whether in the body or cloud-side. That’s why our trend will be [to continue to improve the] connection function.”

“We have now just launched the PDT-FP1 connection device, a data transmitter combined with the camera, with the data transmitted to the cloud. This is a first step. We will utilize these functions to calibrate our data in the cloud and back to the camera, in the future.”

Global shutter

While our conversation about AI stayed in the realm of vagueness, as talk of future features tends to, we tried to pin Oshima down a little more on the potential of the global shutter technology introduced in the a9 III.

Interestingly he didn’t make grand promises of it becoming an essential feature across the range, instead suggesting it needed to prove its value first.

“Before thinking about the big future, we have to focus on how to broaden the a9 III’s global shutter benefit into the market. So starting with the professionals, that’s the key,” he says: “[for] now we are focusing on how to broaden our global shutter benefit in the professional market.”

This article is based on an interview conducted by Dale Baskin and Richard Butler at the CP+ expo in Yokohama, Japan.

This article comes from DP Review and can be read on the original site.