- Why AI forensics will be important in 2024
- 2024 will be critical to maintaining the integrity, security, and trustworthiness of AI systems. In a rapidly developing technological landscape, artificial intelligence has become a cornerstone of innovation across many fields. However, as AI is integrated into critical infrastructure and workflows, the need for AI forensics is more evident than ever. In 2024, to ensure that artificial intelligence...
- AI 762 2024-04-11 12:55:14
-
- Large models are also very powerful in time series prediction! The Chinese team activates new capabilities of LLM and achieves SOTA beyond traditional models
- The potential of large language models has been unlocked: high-precision time series prediction can now be achieved without training the large language model itself, surpassing all traditional time series models. Monash University, Ant Group, and IBM Research jointly developed a general framework that extends the ability of large language models to process sequence data across modalities, an important technological innovation. Time series prediction benefits decision-making in complex systems such as cities, energy, transportation, and remote sensing, so large models are expected to revolutionize time series and spatiotemporal data mining. The research team proposed a general reprogramming framework that applies large language models to general time series prediction without any training, built on two key techniques: time series input reprogramming and prompt prefixing...
- AI 347 2024-04-11 09:43:20
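The "input reprogramming" idea above can be sketched in a few lines; this is a minimal illustration, not the authors' implementation — the patch length, projection weights, and embedding dimension here are all invented for the example:

```python
import numpy as np

def reprogram_series(series, patch_len=4, d_model=8, seed=0):
    """Sketch: slice a time series into patches and linearly project each
    patch into the LLM's embedding dimension. The random matrix stands in
    for a learned projection; a text prompt prefix would be prepended to
    these embeddings before feeding the frozen LLM."""
    rng = np.random.default_rng(seed)
    n_patches = len(series) // patch_len
    patches = np.array(series[:n_patches * patch_len], dtype=float)
    patches = patches.reshape(n_patches, patch_len)
    W = rng.standard_normal((patch_len, d_model))  # stand-in for learned weights
    return patches @ W  # shape: (n_patches, d_model)

embeddings = reprogram_series(list(range(16)), patch_len=4, d_model=8)
print(embeddings.shape)  # (4, 8)
```

The frozen LLM then attends over these patch embeddings exactly as it would over word embeddings, which is why no retraining is needed.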
-
- Mistral open source 8X22B large model, OpenAI updates GPT-4 Turbo vision, they are all bullying Google
- There really is a trend of ganging up on Google! Just as Google made a series of major releases at its Cloud Next conference last night, everyone else grabbed the attention: OpenAI updated GPT-4 Turbo, and Mistral open-sourced its 8x22B super-large model. Google's inner monologue: "A group of children from the southern village bully me because I am old and frail." The second-largest open-source model: Mixtral 8x22B. In January this year, Mistral AI announced the technical details of Mixtral 8x7B and launched the Mixtral 8x7B-Instruct chat model, whose performance on human evaluation benchmarks significantly exceeds GPT-3.5 Turbo, Claude 2.1, Gemini Pro, and Llama...
- AI 893 2024-04-10 17:37:27
-
- Building a digital, decarbonized energy future: Technology-driven green transformation
- Against the backdrop of increasingly severe global climate change, humanity faces serious climate and energy crises. To achieve sustainable development and protect our planet, we need to take decisive action toward a digital, decarbonized energy future. Here we briefly discuss how artificial intelligence, the Internet of Things, big data, and other technologies can address current climate and energy challenges and create a green, low-carbon future. First, artificial intelligence can play an important role in energy management: intelligent energy systems enable efficient energy utilization, automated control, and dynamic adjustment to demand, optimizing energy distribution and use while reducing consumption and waste. Second, IoT technology can interconnect energy equipment...
- AI 804 2024-04-10 15:22:09
-
- The open source model wins GPT-4 for the first time! Arena's latest battle report has sparked heated debate, Karpathy: This is the only list I trust
- An open-source model that can beat GPT-4 has appeared! According to the latest battle report from the large model arena, the 104-billion-parameter open model Command R+ has climbed to 6th place, tying with GPT-4-0314 and surpassing GPT-4-0613. It is also the first open-weight model to beat GPT-4 in the large model arena, which is one of the few test benchmarks that Karpathy says he trusts. Command R+ comes from AI unicorn Cohere, whose co-founder and CEO is none other than Aidan Gomez, the youngest author of the Transformer paper. As soon as this battle report came out, another wave of debate kicked off in the large model community...
- AI 615 2024-04-10 15:16:14
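The arena rankings mentioned above are computed from pairwise human-vote "battles" using Elo-style ratings; a minimal sketch of a single update (the K-factor and the starting ratings here are illustrative, and the real leaderboard uses a more involved statistical fit):

```python
def elo_update(r_winner, r_loser, k=32.0):
    """One Elo update after a pairwise battle: the winner gains more points
    the less it was expected to win."""
    expected_win = 1.0 / (1.0 + 10 ** ((r_loser - r_winner) / 400.0))
    delta = k * (1.0 - expected_win)
    return r_winner + delta, r_loser - delta

# An underdog open model beating a higher-rated model gains a large delta:
new_open, new_closed = elo_update(1180.0, 1250.0)
```

Because the total rating change is zero-sum, a long streak of upset wins is what moves an open-weight model past a frontier model on the board.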
-
- Is the Llama architecture inferior to GPT2? Magical token improves memory 10 times?
- How much human knowledge can a 7B-parameter language model store, and how do we quantify that value? How do training time and model architecture affect it? What impact do floating-point compression and quantization, mixture-of-experts (MoE) architectures, and differences in data quality (encyclopedia knowledge vs. low-quality web text) have on the knowledge capacity of an LLM? The latest study, "Physics of Language Models, Part 3.3: Scaling Laws of Knowledge," by Zeyuan Allen-Zhu (Meta AI) and Yuanzhi Li (MBZUAI), used massive experiments (50,000 tasks, 4,200,000 GPU hours in total) to distill 12 laws of LLM knowledge capacity under different settings, providing a more precise way to measure it.
- AI 1130 2024-04-10 15:13:13
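One headline result of that line of work is that sufficiently trained models store on the order of 2 bits of knowledge per parameter; a back-of-envelope calculation under that single assumption (this ignores the other laws about architecture, quantization, and data quality):

```python
def knowledge_capacity_bits(n_params, bits_per_param=2.0):
    """Rough knowledge capacity under an assumed bits-per-parameter law."""
    return n_params * bits_per_param

cap = knowledge_capacity_bits(7e9)  # a 7B model
print(f"{cap:.1e} bits, i.e. about {cap / 8 / 1e9:.2f} GB of stored facts")
```

The interesting part of the paper is precisely when this linear law bends: under-training, MoE routing, low-bit quantization, and noisy data all reduce the effective bits per parameter.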
-
- iFlytek Spark V3.5 is officially released, based on the national computing power platform 'Flying Star One' training
- iFlytek held the Spark Cognitive Large Model V3.5 upgrade conference on January 30, where Liu Qingfeng, chairman of iFlytek, and Liu Cong, dean of its research institute, officially released iFlytek Spark V3.5, trained entirely on domestic industrial computing power. iFlytek had announced on October 24, 2023 the launch of "Flying Star One" (Feixing No. 1), the first 10,000-GPU domestic computing platform supporting training of large models with trillions of parameters. In the more than 90 days since its launch, iFlytek has trained larger-parameter models on Flying Star One to benchmark against GPT-4, resulting in the V3.5 upgrade released on January 30. iFlytek Spark V3.5 performs strongly in language understanding, text generation, knowledge question answering, logical reasoning, mathematical ability, and generation...
- AI 903 2024-04-10 14:49:01
-
- Recurrent AI joins hands with Moonshot AI to build industry large model solutions and applications
- Recurrent AI, a provider of large model solutions for traditional industries, announced an in-depth strategic cooperation with Moonshot AI, a company focused on the research and development of general-purpose large models. Building on Moonshot AI's general-purpose large model, Recurrent AI will provide industries with better domain-specific large model solutions and applications across business scenarios. Through this in-depth cooperation, the two parties will jointly promote the application of artificial intelligence in traditional industries and accelerate their digital transformation. Recurrent AI CEO Chen Qicong said: "In various...
- AI 755 2024-04-10 14:37:02
-
- A full breakthrough, Google updated a large number of large model products last night
- On Tuesday, Google released a series of AI model updates and products at Google Cloud Next 2024, including Gemini 1.5 Pro, which provides native speech understanding for the first time; a new code generation model, CodeGemma; its first self-developed Arm processor, Axion; and more. Gemini 1.5 Pro, Google's most powerful generative AI model, is now available in public preview on Vertex AI, Google's enterprise-focused AI development platform. The context it can handle grows from 128,000 tokens to 1 million tokens...
- AI 1089 2024-04-10 14:34:09
-
- More than 13 times faster than manual work, 'robot + AI' discovers the best electrolyte for batteries and accelerates materials research
- Editor | Ziluo. The traditional materials R&D model relies mainly on trial-and-error experimentation or accidental discovery, and its development cycle typically takes 10-20 years. Data-driven methods based on machine learning (ML) can accelerate the design of new materials for clean energy technologies, but their practical application in materials research is still limited by the lack of large-scale, high-fidelity experimental databases. Recently, research teams from Pacific Northwest National Laboratory and Argonne National Laboratory in the United States designed a highly automated workflow that combines a high-throughput experimental platform with state-of-the-art active learning algorithms to efficiently screen binary organic solvents for optimal anolyte solubility. The goal of this research is to improve the performance and stability of energy storage systems to advance renewable energy...
- AI 324 2024-04-10 13:30:20
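The "high-throughput platform + active learning" loop described above can be sketched as a toy, exploit-only search; the candidate solvent ratios, the measurement function, and the greedy selection rule here are all invented for illustration and are not the labs' actual algorithm:

```python
def active_learning_screen(candidates, measure, n_rounds=4):
    """Sketch of an active-learning screening loop: seed with the endpoints
    and midpoint, then repeatedly measure the untested candidate closest to
    the current best. `measure` stands in for the robotic platform."""
    seeds = (candidates[0], candidates[len(candidates) // 2], candidates[-1])
    tested = {c: measure(c) for c in seeds}
    for _ in range(n_rounds):
        best = max(tested, key=tested.get)
        untested = [c for c in candidates if c not in tested]
        if not untested:
            break
        nxt = min(untested, key=lambda c: abs(c - best))  # greedy "exploit" step
        tested[nxt] = measure(nxt)
    return max(tested, key=tested.get)

# Toy solubility landscape peaking at a solvent ratio of 0.6:
best = active_learning_screen([i / 10 for i in range(11)],
                              lambda x: -(x - 0.6) ** 2)
print(best)  # 0.6
```

The practical payoff is the one the article cites: each robotic measurement is expensive, so choosing the next experiment adaptively finds the optimum with far fewer trials than exhaustively measuring every candidate.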
-
- IEEE Interpretable AI Architecture Standard P2894 Officially Released
- Explainable AI (XAI) is an emerging branch of artificial intelligence that analyzes the logic behind each decision an AI system makes, and it is a core concern for the sustainable development of AI. With the advent of the large model era, models are becoming ever more complex, so interpretability is of great significance to improving the transparency, security, and reliability of AI systems. The international standard for explainable AI, IEEE P2894, has been released, opening the "black box" of AI. Recently, the IEEE Standards Association published standard P2894 (Guide for an Architectural Framework for Explainable Artificial Intelligence)...
- AI 906 2024-04-10 13:25:15
-
- Registration channel for China Embodied Intelligence Conference 2024 is open | CEAI 2024
- ■ Participation information ■ Meeting time: March 29-31, 2024 ■ Meeting location: Xi'an Zhi Tower, Xuhui District, Shanghai ■ Travel and accommodation: Participants must present their ticket purchase and registration information to attend; travel, meals, and accommodation during the meeting are at your own expense (the registration fee includes some meals; see the instructions below) ■ Ticket information: CAAI member registration: https://caai.kejie.org.cn/member/, please select "Embodied Intelligence Special Committee (in preparation)" as the development source ■ Payment methods: 1. Purchase tickets via Alipay 2. Purchase tickets via bank transfer. Account name: China Artificial Intelligence Society; Account number: 0200002909200166203; Bank: Industrial and Commercial Bank of China, Xinjiekou Branch. Notes: 1. No matter which option you choose...
- AI 937 2024-04-10 13:19:08
-
- Learn in one article how artificial intelligence enables machines to learn
- In the article "Understanding Artificial Intelligence (AI) in One Article," we introduced the kinds of complex problems AI can solve: problems that cannot be handled by fixed rules and that require machines to make judgments by comparing against past examples. Machines need to imitate how humans learn. The human learning process can be divided into the following stages: perception, memory, comparison and induction, and summary and practice. Perception: obtaining information from the outside world through the senses (vision, hearing, touch, taste, and smell), observing and experiencing the surrounding environment and what is happening. Beyond personal experience, to obtain information more efficiently, people also listen to stories, read books, and watch videos to learn about the past. Memory: when we acquire new information, we...
- AI 946 2024-04-09 18:31:27
-
- Eight stark realities of the skills gap in the era of industrial automation
- The Industry 4.0 skills gap is becoming increasingly apparent as industrial automation takes center stage in today's economy, revolutionizing how products are produced and services are delivered. With the integration of artificial intelligence, robotics, and the Internet of Things, this technological leap optimizes efficiency and reshapes the job market. A key challenge emerges: the huge gap between existing skills and the advanced capabilities needed to navigate this new era. To close this gap, businesses and workers must keep learning and adapting to ensure they thrive in a changing world.
- AI 1028 2024-04-09 18:16:15
-
- Peking University open-sources aiXcoder-7B, its most powerful code model! Focused on real development scenarios and designed for enterprise private deployment
- Judging from the latest developments in tech circles, AI code generation has become a very popular concept. But do you feel that while AI programming demos are eye-catching, they fall short in real enterprise development scenarios? Enter aiXcoder, a low-key veteran player, with a big move: a newly open-sourced code model, the aiXcoder-7B Base version, designed specifically for deployment in enterprise software development scenarios. So what level of AI programming can a code model with "only" 7 billion parameters show? Let's first look at its performance on the three mainstream evaluation sets HumanEval, MBPP, and MultiPL-E, where its average score actually exceeds that of 34-billion-parameter Co...
- AI 1170 2024-04-09 18:10:02
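Scores on HumanEval-style benchmarks are commonly reported as pass@k, estimated with the standard unbiased formula (whether aiXcoder's reported numbers use exactly this estimator is an assumption on our part):

```python
from math import comb

def pass_at_k(n, c, k):
    """Unbiased pass@k estimator: the probability that at least one of k
    samples drawn from n generations (of which c are correct) passes.
    pass@k = 1 - C(n-c, k) / C(n, k)."""
    if n - c < k:
        return 1.0  # fewer failures than draws, so some draw must pass
    return 1.0 - comb(n - c, k) / comb(n, k)

# 5 correct solutions out of 20 generations:
print(pass_at_k(20, 5, 1))  # 0.25
```

Averaging this quantity over every problem in HumanEval or MBPP yields the single headline number models are ranked by.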