
NeurIPS2024 Edge Device Large Language Model Challenge

PHPz
Release: 2024-08-07 16:03:22

With the rapid development of artificial intelligence, especially in the field of natural language processing (NLP), large language models (LLMs) have shown great transformative potential. These models are changing the way we work and communicate, and they are finding applications across a wide range of computing devices. However, the sheer size of LLMs poses considerable challenges for deployment on edge devices such as smartphones, IoT devices, and in-vehicle systems. Our competition aims to push the limits of LLM performance, efficiency, and multitasking capability on resource-constrained edge devices.

Competition background: Although LLMs have enormous application potential, their huge parameter counts place severe demands on edge-device resources. For example, a 10B-parameter LLM can require up to 20 GB of memory even after quantization, far beyond the memory capacity of most smartphones (a rough memory calculation is sketched at the end of this announcement). In addition, the high energy consumption of LLM inference is a serious problem: a fully charged smartphone typically lasts less than two hours when used for LLM conversation.

Main challenges:
- Memory requirements: LLM inference needs a large amount of memory, more than even high-end smartphones can provide.
- Energy consumption: The high energy cost of LLM inference strains smartphone battery life.
- Performance loss: Maintaining model performance while achieving a high compression ratio remains a major problem for existing techniques.
- Lack of offline functionality: Most LLMs require an Internet connection, which limits their use in environments with unstable networks.

Competition goals: This competition aims to address the challenges above and promote the practical application of LLMs on edge devices. We invite researchers, engineers, and industry professionals from all fields to jointly design the systems, hardware, and algorithms that enable the deployment of high-performance LLMs on edge devices.

Join our NeurIPS Edge Device LLM Challenge! We sincerely invite experts from all fields to participate and demonstrate the capabilities of LLMs on edge devices. The top three teams will share a total prize pool of 300,000.

Competition tracks:
- Compression challenge: Compress a pre-trained LLM without significant performance loss.
- Training-from-scratch challenge: Train an LLM designed for edge devices from scratch, optimizing the training process to create efficient models.

Reasons to participate:
- Win the grand prize: a total prize pool of 300,000.
- Drive innovation: contribute to cutting-edge LLM research.
- Grow your network: connect with leading experts and peers in the field.
- Showcase your talent: gain recognition at a top AI conference.

Important dates:
- Registration opens: June 25, 2024
- Registration deadline: July 25, 2024
- Submission deadline: October 25, 2024
- Results announced: November 20, 2024
- Offline workshop: December 11, 2024

How to participate:
- Register: register via Google Forms on the official competition website.
- Choose a track: the compression challenge, the training-from-scratch challenge, or both.
- Submit your solution: submit your model before the deadline.

Are you ready to take on the challenge? Register now and start preparing for the competition!
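To make the memory figures above concrete, here is a minimal back-of-the-envelope sketch of how an LLM's weight memory scales with numeric precision. The 10B parameter count and the precisions shown are illustrative assumptions; the sketch counts weights only and ignores activations, the KV cache, and runtime overhead.

```python
# Back-of-the-envelope weight-memory estimate for an LLM at different precisions.
# Illustrative only: counts weights, ignoring activations, KV cache, and runtime overhead.

def param_memory_gb(num_params: float, bits_per_param: int) -> float:
    """Approximate memory (in GiB) needed just to store the model weights."""
    total_bytes = num_params * bits_per_param / 8
    return total_bytes / (1024 ** 3)

if __name__ == "__main__":
    num_params = 10e9  # the 10B-parameter example from the background section
    for label, bits in [("FP16", 16), ("INT8", 8), ("INT4", 4)]:
        print(f"{label:>4}: ~{param_memory_gb(num_params, bits):.1f} GiB")
    # Prints roughly: FP16 ~18.6 GiB, INT8 ~9.3 GiB, INT4 ~4.7 GiB
```

Weights alone at FP16 already land near 20 GB for a 10B model, and real deployments add activations, the KV cache, and runtime buffers on top, which is why the compression track focuses on shrinking models without significant performance loss.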
For more information and registration, please visit:
Discord: Link
Registration link: Link

Let's work together to promote the development of LLMs on edge devices. We wish you good luck and look forward to seeing you at NeurIPS 2024!

The NeurIPS Edge Device LLM Challenge Organizers



Source: jiqizhixin.com