Exploring AI large model innovation trends for terminal operating systems: the OS Native Intelligence sub-forum of the OpenHarmony Technology Conference was held

WBOY
Release: 2023-11-06 11:29:11

On November 4, the second OpenHarmony Technology Conference of the Open Atom Open Source Foundation was held in Beijing. At the OS Native Intelligence sub-forum that afternoon, Jin Xuefeng, chief architect of Huawei Shengsi MindSpore, served as producer, and Wang Lei, architect of Huawei's AI large model application development platform, served as moderator. Industry leaders from companies such as Huawei, ChinaSoft International Co., Ltd. and Shenzhen Ruoxin Technology Co., Ltd., together with experts and scholars from Tsinghua University, Shanghai Jiao Tong University and other institutions, shared how their respective fields combine AI large model technologies with the OpenHarmony ecosystem to innovate and break through technical problems. They jointly discussed the development trend of combining AI large models with terminal operating systems, contributing to the joint construction of a prosperous OpenHarmony ecosystem.

Producer: Jin Xuefeng, chief architect of Huawei Shengsi MindSpore

Moderator: Wang Lei, Architect of Huawei AI Large Model Application Development Platform

First, Li Yuanchun, an assistant researcher at Tsinghua University, gave a report titled "Large Model Driven Terminal Intelligent Agent", introducing the design and implementation of a large-model-driven intelligent personal agent system. He shared how automatic analysis of mobile applications and large-model knowledge embedding can organically combine the domain knowledge inside an application with the common-sense knowledge of a large model to achieve more accurate and efficient task automation. Li Yuanchun pointed out that intelligent personal agents have long been one of the key technologies that terminal system researchers and developers focus on, but insufficient capabilities in user intention understanding, task planning, and tool use have limited their intelligence and scalability, leaving major shortcomings. The emergence of large language models offers a way to overcome these difficulties. Li Yuanchun said: "The emergence of large models, represented by large language models, has brought new opportunities to this field. Their powerful semantic understanding and common-sense reasoning capabilities are expected to greatly expand the breadth and depth of the functions that intelligent personal agents can support."
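
To make the idea concrete, here is a minimal, hypothetical sketch (not the system described in the talk) of how app-specific actions discovered by automated app analysis might be embedded into a large-model prompt so the model can pick the next step of a user task. The `query_llm` callable is a placeholder for any text-generation backend.

```python
# Minimal sketch only: combine app domain knowledge (available actions found
# by automated app analysis) with an LLM's common-sense planning.
from dataclasses import dataclass

@dataclass
class UIAction:
    """One action exposed by the app, e.g. discovered by crawling its UI."""
    name: str
    description: str

def build_prompt(goal: str, screen: str, actions: list[UIAction]) -> str:
    """Embed the app's available actions into the LLM prompt."""
    lines = [f"User goal: {goal}", f"Current screen: {screen}", "Available actions:"]
    lines += [f"- {a.name}: {a.description}" for a in actions]
    lines.append("Reply with the single action name that best advances the goal.")
    return "\n".join(lines)

def choose_next_action(goal, screen, actions, query_llm):
    """query_llm(prompt) -> str is a placeholder for any LLM backend."""
    reply = query_llm(build_prompt(goal, screen, actions)).strip()
    return next((a for a in actions if a.name == reply), None)  # None if off-list
```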

Topic sharing by Li Yuanchun, assistant researcher at Tsinghua University

Zhang Zhaosheng, general manager of the Intelligent IoT Legion Product R&D Management Department at ChinaSoft International Co., Ltd. and a member of the Technical Steering Committee of the OpenHarmony project group, drew on his deep insight into large models and device-edge collaboration to deliver a report titled "Visual Large Models in OpenHarmony's Device-Edge-Cloud Applications". He pointed out that in the era of intelligence, as large models and underlying computing power technologies continue to improve, deploying large models on the edge and terminal sides has become an inevitable trend. He strongly affirmed the value and significance of OpenHarmony as a digital base: "As a technical base for the era of the Internet of Everything, OpenHarmony can be widely used in a variety of computing scenarios, meeting business needs for multiple connections, high real-time performance, and massive heterogeneous data. It exerts hardware computing power downward and enables massive applications upward." In Zhang Zhaosheng's view, using OpenHarmony as the foundation to integrate edge-side large model capabilities and build device-edge collaborative business scenarios can provide the industry with more competitive and innovative solutions and accelerate the expansion of the ecosystem.

Topic sharing by Zhang Zhaosheng, general manager of the Intelligent IoT Legion Product R&D Management Department of ChinaSoft International Co., Ltd. and member of the Technical Steering Committee of the OpenHarmony project group

Zhou Jianhui, architect of Huawei Xiaoyi, gave a report titled "Exploration of Device-side Native Intelligent Large Model Ecosystem Construction Based on the Intent Framework", drawing on Huawei Xiaoyi's thinking on intelligent service technology in terminal products. The report explained that the intent framework is a system-level intent standard system: it builds a global intent paradigm through multi-dimensional system perception, large models, and other capabilities to understand users' explicit and potential intentions, identify user needs promptly and accurately, pass them on to ecosystem partners, match timely services, and provide users with multi-modal, personalized advanced scenario experiences.
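
As a purely illustrative sketch, the toy structures below show one way such an intent record might be represented and matched to partner services. The field names and confidence threshold are hypothetical and do not reflect the actual intent framework API.

```python
# Toy illustration of an "intent" record and a partner-service matcher.
# Not the OpenHarmony/HarmonyOS intent framework API.
from dataclasses import dataclass, field

@dataclass
class Intent:
    action: str                               # e.g. "book_taxi"
    confidence: float                         # how sure the system is about the intent
    slots: dict = field(default_factory=dict) # e.g. {"destination": "airport"}

@dataclass
class PartnerService:
    name: str
    supported_actions: set

def match_services(intent, services, threshold=0.7):
    """Forward only sufficiently confident intents to matching partner services."""
    if intent.confidence < threshold:
        return []
    return [s for s in services if intent.action in s.supported_actions]
```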

Topic sharing by Zhou Jianhui, architect of Huawei Xiaoyi

Zheng Wenli, associate professor in the Department of Computer Science and Engineering at Shanghai Jiao Tong University, focused on machine learning methods in a report titled "Dynamic Partitioning of Deep Neural Networks in Device-Edge-Cloud Collaborative Inference". To adapt to dynamic changes in network load and server load in the device-edge-cloud environment, he proposed a machine-learning-based DNN partitioning optimization algorithm and implemented it on Shengsi MindSpore, so that a variety of common CNN and RNN models can automatically adjust their distribution across the device, edge, and cloud sides to keep inference latency at a minimum. He also pointed out: "Resource limitation is the main challenge facing edge computing. Only by striving to break through resource limitations can we maximize the advantages of the edge and make intelligence truly enter everyone's life."
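
The core idea can be sketched as follows: given per-layer cost estimates (which the reported approach predicts with machine learning and refreshes as network and server load change), choose the split point that minimizes end-to-end latency. The helper below is an illustrative simplification under those assumptions, not the algorithm presented in the talk.

```python
# Illustrative sketch of latency-driven DNN partitioning for device-edge
# collaborative inference. Cost estimates are given as inputs here; in
# practice they would be predicted and refreshed as load conditions change.

def best_split(device_ms, edge_ms, out_bytes, input_bytes, bandwidth_bps):
    """
    device_ms[i] : time (ms) to run layer i on the device
    edge_ms[i]   : time (ms) to run layer i on the edge/cloud server
    out_bytes[i] : size of layer i's output (uploaded if we split after layer i)
    input_bytes  : size of the raw model input (uploaded if everything is offloaded)
    Returns (k, latency): layers 0..k-1 run on-device, layers k.. run remotely.
    k = 0 offloads everything; k = n keeps everything on-device.
    """
    n = len(device_ms)
    best_k, best_latency = 0, float("inf")
    for k in range(n + 1):
        local = sum(device_ms[:k])
        payload = input_bytes if k == 0 else (out_bytes[k - 1] if k < n else 0)
        upload = payload * 8 / bandwidth_bps * 1000.0  # bytes -> ms at current uplink
        remote = sum(edge_ms[k:])
        latency = local + upload + remote
        if latency < best_latency:
            best_k, best_latency = k, latency
    return best_k, best_latency
```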

Topic sharing by Zheng Wenli, associate professor in the Department of Computer Science and Engineering, Shanghai Jiao Tong University

Hou Lu, a researcher at Huawei's Noah's Ark Lab, presented a report titled "Compression, Acceleration, and Efficient Deployment of Large Language Models". The report examined the memory, memory-access, and computing challenges that large language models face in the inference phase from the perspectives of architecture design, cost, throughput, latency, and long sequences, and discussed the benefits that model and KV-cache quantization compression, large-operator fusion, and speculative decoding bring to efficient large language model inference.
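
To illustrate why KV-cache compression matters at long sequence lengths, here is a rough back-of-the-envelope calculation. The model configuration used below is a made-up example, not any model discussed in the report.

```python
# Back-of-the-envelope estimate of KV-cache memory at different precisions.
# The layer/head/dimension numbers are illustrative placeholders.

def kv_cache_bytes(layers, kv_heads, head_dim, seq_len, batch, bytes_per_elem):
    # Both K and V tensors are cached per layer, hence the factor of 2.
    return 2 * layers * kv_heads * head_dim * seq_len * batch * bytes_per_elem

cfg = dict(layers=32, kv_heads=32, head_dim=128, seq_len=8192, batch=1)
fp16 = kv_cache_bytes(**cfg, bytes_per_elem=2)    # 16-bit cache
int4 = kv_cache_bytes(**cfg, bytes_per_elem=0.5)  # 4-bit quantized cache
print(f"FP16 KV cache: {fp16 / 2**30:.1f} GiB")
print(f"INT4 KV cache: {int4 / 2**30:.1f} GiB  (~4x smaller)")
```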

Topic sharing by Hou Lu, researcher at Huawei's Noah's Ark Lab

Zhou Peng, CTO of Shenzhen Ruoxin Technology Co., Ltd., gave a talk titled "Ubiquitous Intelligence: Applications and Solutions of End-to-End Brain-inspired Large Models", introducing the computing-power overhead challenges facing large language models and how the Transformer architecture can be improved by processing information in a way similar to the human brain. By building networks from third-generation neural networks, i.e. spiking neural networks (SNNs), data in the network is stored and transmitted as spike trains, which greatly reduces inference overhead without affecting the intelligence of the network. Zhou Peng said: "A large model trained with this technology not only achieves optimal energy consumption, latency, and computing power requirements at the same parameter scale, but can also be deployed and run locally on consumer-grade devices."
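
For readers unfamiliar with SNNs, the toy leaky integrate-and-fire (LIF) neuron below illustrates the general principle of spike-based computation: information travels as sparse binary spike trains, so work is only done when a spike arrives. The parameters are illustrative and do not represent Ruoxin's actual brain-inspired model.

```python
# Toy LIF neuron, the basic unit of spiking neural networks (SNNs).
import numpy as np

def lif_neuron(input_spikes, weights, tau=0.9, threshold=1.0):
    """Simulate one LIF neuron over T time steps.
    input_spikes: (T, N) binary array of presynaptic spikes
    weights:      (N,) synaptic weights
    Returns the (T,) binary output spike train."""
    v = 0.0
    out = np.zeros(len(input_spikes), dtype=np.int8)
    for t, spikes in enumerate(input_spikes):
        v = tau * v + weights @ spikes     # leak, then integrate incoming spikes
        if v >= threshold:                 # fire and reset
            out[t] = 1
            v = 0.0
    return out

rng = np.random.default_rng(0)
spikes = (rng.random((100, 16)) < 0.1).astype(np.float32)  # sparse spike input
print(lif_neuron(spikes, rng.normal(0, 0.5, 16)).sum(), "output spikes")
```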

Topic sharing by Zhou Peng, CTO of Shenzhen Ruoxin Technology Co., Ltd.

Li Zheng, architect of Huawei Shengsi MindSpore, shared the report "MindSpore Device-side Large Model Deployment Helps OS Intelligence", drawing on the MindSpore framework's technical exploration and practical experience to explain how to address the deployment and application challenges of large AI models. Li Zheng noted: "Generative AI large models are quietly changing people's work and life, and the combination of AI large models with smart terminals will inevitably bring new experiences. However, due to the computing power and storage constraints of terminal devices, device-side deployment of AI large models faces many challenges." In the report he introduced MindSpore's exploration and related technical ideas for device-side deployment and inference acceleration of large models. As an open-source AI framework that unifies training and inference across cloud, edge, and device and covers full scenarios, MindSpore is deeply involved in performance optimization of HarmonyOS AI business scenarios and has been widely used in Huawei mobile phones, tablets, watches, notebooks, smart screens, smart cars, and other terminal products.
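
As a rough illustration of the storage constraint mentioned above, the sketch below estimates whether a model's weights fit within a hypothetical device RAM budget at different quantization bit-widths. The parameter count and budget are made-up examples, not MindSpore specifics or any particular shipped model.

```python
# Rough feasibility check: do the quantized weights fit the device budget?
# All numbers below are hypothetical.

def weight_gib(n_params: float, bits: int) -> float:
    return n_params * bits / 8 / 2**30

n_params = 7e9      # a hypothetical 7B-parameter model
budget_gib = 4.0    # hypothetical RAM the OS allows the model to use
for bits in (16, 8, 4):
    size = weight_gib(n_params, bits)
    verdict = "fits" if size <= budget_gib else "exceeds"
    print(f"{bits:>2}-bit weights: {size:5.2f} GiB -> {verdict} a {budget_gib} GiB budget")
```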

Topic sharing by Li Zheng, architect of Huawei Shengsi MindSpore

In the final session of the OS Native Intelligence sub-forum, forum moderator Wang Lei, architect of Huawei's AI large model application development platform, producer Jin Xuefeng, chief architect of Huawei Shengsi MindSpore, and the guest speakers held a roundtable discussion on device-side large models, further exploring the technical concepts and ideas behind AI application architectures and frameworks, model training frameworks, and model architectures and algorithms under OS native intelligence.

"Technological Frontier Challenges and Solution Ideas for Device-side Large Models and Applications" Roundtable Discussion

With that, the OS Native Intelligence sub-forum of the second OpenHarmony Technology Conference came to a successful conclusion. The sharing by technical experts and industry leaders demonstrated OpenHarmony's technical potential and application practice in the field of native intelligence, and pointed to a "win-win" prospect in which the deep integration of AI large models and terminal operating systems expands large model capabilities while making terminal devices smarter. As more technical experts and industry leaders in the field of native intelligence join the OpenHarmony community, the OpenHarmony ecosystem is bound to flourish. We look forward to more partners joining in the joint construction of OpenHarmony technology and ecosystem, working together to "build the ecosystem with technology and win the future with intelligence".

Source: sohu.com