
The domestic open-source version of the 'ChatGPT plugin system' is here! Douban and search plugins are both available; jointly released by Tsinghua University, Face Wall Intelligence, and others

王林
Release: 2023-05-24 14:25:07

Recently, an open-source project billed as a "domestic alternative to ChatGPT Plugins" has seen a sharp increase in stars on GitHub.

The project is BMTools, a tool-learning engine for large models developed by Face Wall Intelligence.


Project address: //m.sbmmt.com/link/a330f9fecc388ce67f87b09855480ca3

Exploring the frontier in depth: quickly embedding tool learning into large models

First of all, the most important question is, what is so great about BMTools?

BMTools is an open-source, extensible tool-learning platform built on language models. The Face Wall R&D team has unified a wide range of tool-calling workflows under the BMTools framework, standardizing and automating the entire tool-calling process.

The plug-ins currently supported by BMTools cover entertainment, academia, daily life, and more, including douban-film (Douban movies), search (Bing search), and Klarna (shopping).

With BMTools, developers can have a given model (such as ChatGPT or GPT-4) call a variety of tool interfaces to implement specific functions.

In addition, the BMTools toolkit integrates the recently popular Auto-GPT and BabyAGI.
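To make the idea concrete, the sketch below shows the general tool-calling loop that frameworks of this kind standardize: the model picks a tool, the framework executes it, and the observation is fed back to the model. The tool registry, prompts, and call_llm() stub are illustrative assumptions, not the actual BMTools API.

```python
# A minimal, hypothetical sketch of a standardized tool-calling loop.
# The registry, prompts, and call_llm() are stand-ins, not the real BMTools API.
import json

TOOLS = {
    "douban_film": lambda query: f"[Douban] top results for '{query}'",
    "bing_search": lambda query: f"[Bing] search results for '{query}'",
}

def call_llm(prompt: str) -> str:
    """Placeholder for a real model call (e.g. ChatGPT or GPT-4 via an API)."""
    if "Tool result:" in prompt:
        return "Answer composed from the tool result above."
    # A real system would prompt the model to emit a structured tool call.
    return json.dumps({"tool": "bing_search", "args": {"query": "BMTools"}})

def run_with_tools(question: str) -> str:
    # 1. Ask the model which registered tool to invoke and with what arguments.
    decision = json.loads(call_llm(f"Question: {question}\nTools: {list(TOOLS)}"))
    # 2. Execute the chosen tool outside the model.
    observation = TOOLS[decision["tool"]](**decision["args"])
    # 3. Feed the observation back so the model can compose the final answer.
    return call_llm(f"Question: {question}\nTool result: {observation}\nAnswer:")

print(run_with_tools("What is BMTools?"))
```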


So, what does this kind of tool learning do for large models?

Although large models have achieved remarkable results in many areas, they still have limitations on tasks in specialized domains, which often require dedicated tools or domain knowledge to solve effectively.

Therefore, just as a smartphone needs to install apps to deliver a better user experience, a large model needs the ability to call various specialized tools before it can fully support real-world tasks.

This gave rise to the new paradigm of tool learning for large models. At its heart is the fusion of specialized tools with the strengths of foundation models, achieving greater accuracy, efficiency, and autonomy in problem solving.

The organic combination of large models and external tools makes up for many earlier capability gaps, and tool learning has greatly unleashed the potential of large models.


Paper address: https://arxiv.org/abs/2304.08354

On March 23, 2023, OpenAI announced its plugin system (Plugins). The capability behind these plugins is exactly what we call tool learning.

With tool learning, Plugins let ChatGPT connect to browsers, mathematical calculators, and other external tools, greatly enhancing its capabilities.

The arrival of ChatGPT Plugins filled in ChatGPT's last missing pieces, giving it internet access and the ability to handle mathematical calculations; it has been called OpenAI's "App Store" moment. So far, however, it has been limited to ChatGPT Plus users and remains unavailable to most developers.
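Under the hood, a plugin of this kind is essentially a small manifest the host system shows to the model: a machine-readable name, a natural-language description the model reads to decide when to call the tool, and an API endpoint the host can invoke. The sketch below is a simplified illustration with hypothetical field names and URL, not OpenAI's exact ai-plugin.json schema.

```python
# A hypothetical, simplified plugin manifest for a calculator-style tool.
# Field names and the endpoint URL are illustrative, not OpenAI's exact schema.
calculator_plugin = {
    "name_for_model": "calculator",
    "description_for_model": (
        "Evaluate arithmetic expressions when the user asks for exact math."
    ),
    "api": {
        "type": "openapi",
        "url": "https://example.com/openapi.yaml",  # hypothetical endpoint
    },
}

# The host injects the description into the model's context; when the model decides
# to use the tool, the host calls the declared API and returns the result to it.
print(calculator_plugin["description_for_model"])
```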


Why was Face Wall Intelligence able to launch BMTools just ten days after ChatGPT Plugins was released?

Face Wall Intelligence has long concentrated on efficient computing tools covering the full large-model pipeline. Since 2022 its R&D team has been researching the new paradigm of tool learning, combining existing language models with search engines, knowledge bases, and other tools, achieving good experimental results and making fruitful explorations at the research frontier of tool learning.

To meet developers' eager expectations for OpenAI-Plugins-style capabilities, the team quickly packaged its accumulated research results into the BMTools toolkit, embedding tool learning into Face Wall Intelligence's large-model capability system and officially adding it to the OpenBMB large-model tool "family bucket".

Tool learning is Face Wall Intelligence's latest release, following its suites for efficient training, fine-tuning, inference, and compression.


BMTools toolkit: https://m.sbmmt.com/link/a330f9fecc388ce67f87b09855480ca3

Out in front: the first open-source Chinese Q&A model with internet search support

Recently, Face Wall Intelligence teamed up with researchers from Tsinghua University, Renmin University of China, and Tencent to release WebCPM, the first open-source Chinese question-answering model framework based on interactive web search, filling a gap among domestic large models. WebCPM is also a successful application of BMTools.

The WebCPM work has been accepted at ACL 2023, a top conference in natural language processing.

WebCPM paper link: https://arxiv.org/abs/2305.06849

WebCPM data and code: https://github.com/thunlp/WebCPM

Since ChatGPT took off, large models from all quarters in China have sprung up like mushrooms after rain, but most of them are not connected to the internet.

Large models without internet access, however, cannot obtain the latest information; their output is generated from stale training data, which imposes real limitations.

What sets WebCPM apart is that its information retrieval is based on interactive web search: like a human, it can interact with a search engine to collect the factual knowledge needed to answer a question and then generate the answer.

In other words, with internet access, the timeliness and accuracy of a large model's answers are greatly enhanced.


WebCPM Model Framework

WebCPM benchmarks against WebGPT, the new-generation search technology that also underpins Microsoft's recently launched New Bing.

Like WebGPT, WebCPM overcomes a shortcoming of the traditional LFQA (long-form question answering) paradigm: its reliance on non-interactive retrieval, which fetches information using only the original question as the query.

Under the WebCPM framework, the model can, just like a human, search the web in real time and filter for high-quality information by interacting with a search engine.

Moreover, when it encounters a complex question, the model, like a human, breaks it down into multiple sub-questions and asks them in sequence.

By identifying and browsing relevant information, the model gradually refines its understanding of the original question and keeps issuing new queries to gather more diverse information, as sketched below.
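The following is a hypothetical sketch of this interactive loop: refine the question, query a search engine, collect supporting facts, then synthesize a long-form answer. The ask_model() and search() stubs are illustrative assumptions, not the actual WebCPM interfaces.

```python
# A hypothetical sketch of an interactive web-search QA loop in the spirit of WebCPM.
# ask_model() and search() are stand-ins, not the real WebCPM interfaces.
from typing import List

def ask_model(prompt: str) -> str:
    """Placeholder for a call to the underlying language model."""
    return "stub model output"

def search(query: str) -> List[str]:
    """Placeholder for a real search-engine call; returns page snippets."""
    return [f"snippet about '{query}'"]

def interactive_qa(question: str, max_rounds: int = 3) -> str:
    collected_facts: List[str] = []
    current_query = question
    for _ in range(max_rounds):
        # Interact with the search engine as a human would and record the findings.
        collected_facts.extend(search(current_query))
        # Refine the understanding of the original question and pose the next sub-question.
        current_query = ask_model(
            f"Question: {question}\nFacts so far: {collected_facts}\nNext sub-question:"
        )
    # Synthesize the final long-form answer from everything collected.
    return ask_model(f"Question: {question}\nFacts: {collected_facts}\nFinal answer:")

print(interactive_qa("Who released WebCPM?"))
```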


WebCPM interactive search interface

Going forward, Face Wall Intelligence will further promote the application and commercialization of these research results and push for the deployment of the WebCPM large model in relevant fields.

Taking the high ground: committed to building a domestic large-model system

Face Wall Intelligence has always striven to lead original innovation in large models. It is committed to building large-model infrastructure for the intelligent era and creating a domestic large-model system, with the ultimate aim of "letting large models fly into thousands of households".

Face Wall Intelligence's achievements are plain to see and have been recognized by the industry.

Zhihu CTO Li Dahai once commented: "The Face Wall Intelligence team was the first in China to conduct large language model research. The company has full-stack technical capabilities for large-model research and application, including fine-tuning and acceleration technology, and its R&D capabilities are industry-leading." Zhihu believes Face Wall Intelligence can grow into a core infrastructure player in China's large-model field and contribute to China's large-model industry.


Face Wall Intelligence panorama

Relying on its tool platform and large-model library, the company has launched the ModelForce large-model system and the enterprise-grade CPM large model. ModelForce, an AI productivity platform built on large models, includes an efficient-computing tool system covering the full pipeline of large-model training, fine-tuning, compression, and inference.

The platform builds on the few-shot and zero-shot general capabilities of large models, applies standardized fine-tuning methods, and offers a zero-code fine-tuning client, which can significantly reduce data-annotation, compute, and labor costs in AI R&D.

The enterprise edition of the CPM large model upgrades the capabilities of the open-source model and features multi-capability integration, incremental fine-tuning with flexible adaptation, and multi-scenario application.

Based on the enterprise-grade CPM large model and the ModelForce system, Face Wall Intelligence partnered with Zhihu to train the "Zhihaitu AI" large model.

The "Zhihaitu AI" large model was applied to the Zhihu hot list, which can quickly extract elements, sort out opinions and aggregate content. It was presented at the Zhihu Discovery Conference on April 23 release.

That is not all. Face Wall Intelligence has built a "trinity" industry-university-research ecosystem for large models. By drawing on the academic research strength of top universities and continuing to build and operate the OpenBMB large-model open-source community, it has created a closed loop linking industry demand, open-source algorithms, and industrial deployment, striving to advance cutting-edge research, application R&D, and industrial development in China's large-model field.


  • OpenBMB open-source community

To help build the domestic large-model open-source ecosystem, it has released a series of full-pipeline open-source toolkits, including OpenPrompt, OpenDelta, BMInf, BMCook, BMTrain, and BMTools, and has launched public large-model courses on platforms such as Zhihu and Bilibili.

  • Natural Language Processing and Social Humanities Computing Laboratory (THUNLP), Department of Computer Science, Tsinghua University

With research strength far ahead among university groups, the lab was founded in the 1970s and is the earliest and most influential research unit in China to conduct NLP research. It is home to many well-known scholars and scientists, and its work in the field of large language models is outstanding.

  • Face Wall Intelligence

Committed to the application and deployment of large models in typical AI scenarios and fields. The CPM large model is a pre-trained language model developed in-house by the Face Wall team, drawing on years of large-model training experience. The company has completed an angel round of financing worth tens of millions of yuan and has reached strategic cooperation agreements with a number of well-known institutions.


On the journey to build a domestic large-model system, Face Wall Intelligence's vision has always been to bring large models into real-world use, empowering more industries and benefiting more companies and individuals.

The spark has started a prairie fire, and we look forward to large models releasing their potential in more fields and showing surprising application value.


Source: 51cto.com