SenseTime 10 years on still has advantages as Chinese AI challenges mount, CEO says

By South China Morning Post | 17 October 2024

Chinese artificial intelligence (AI) company SenseTime celebrated its 10th anniversary on Thursday at its first annual Global AI Summit since the death of co-founder Tang Xiao’ou, with CEO Xu Li highlighting the firm’s chip-agnostic approach as an advantage in the coming years amid an influx of new industry challengers.

“We actually adapt our AI algorithms to more than 50 chipsets, so that’s why our [infrastructure] has a strong [operating system] layer, which is transparent to our users,” Xu told the South China Morning Post in an interview on the sidelines of the event at Science Park. “Some other [companies] also have developed their own chips … but to us, we actually embrace partners in the market.”

SenseTime’s first decade comes to a close in an eventful year for the company. The pivot to generative AI – popularised two years ago by OpenAI’s ChatGPT – helped push revenue up 21 per cent in the first half of the year. Its stock, which has been weighed down by US sanctions since the company’s debut on the Hong Kong stock exchange in December 2021, is also up 30 per cent for the year to HK$1.56 (20 US cents).


SenseTime CEO Xu Li speaks at the company’s 10th Anniversary Global AI Summit event on October 17, 2024. Photo: Matt Haldane

Xu acknowledged that the company faces challenges from US-China geopolitical tensions, but noted that these are not unique to SenseTime.

“It’s definitely a challenge, but it’s not a challenge only to SenseTime, but I think to most Chinese tech companies,” he said. “That’s why we have developed other systems adapting to different, for example, China-made GPU [graphics processing unit] chips.”

Xu reiterated that the company remains on track to become profitable by 2026, which he said will come from efficiency gains in how it uses GPU resources.

“Our goal is using recent technologies to optimise the inference efficiency so that everybody can make it affordable,” he said.
