Project

Chat2DB, a database client tool integrated with AIGC

Chat2DB is an open source, free multi-database client tool that supports local installation on Windows and macOS as well as server-side deployment with web page access. Compared with traditional database client software such as Navicat and DBeaver, Chat2DB integrates AIGC capabilities: it can convert natural language into SQL, convert SQL back into natural language, and […]

React Ink Homepage, Documentation and Downloads – React for Command Line Applications – News Fast Delivery

Ink is a library for building command-line applications with React, offering the same component-based UI building experience that React provides in the browser, just aimed at the terminal instead. In other words, Ink is a React renderer: it converts the React component tree into a string and outputs that string to the terminal.

SuperCLUE Homepage, Documentation and Downloads – Chinese General Large Model Evaluation Benchmark – News Fast Delivery

SuperCLUE is an evaluation benchmark for Chinese general-purpose large models. The main question it answers is: how well do Chinese large models actually perform amid the current boom in general-purpose large models? This includes, but is not limited to: how these models perform on different tasks, and to what extent

Chinese-LLaMA-Alpaca Homepage, Documentation and Downloads – Chinese LLaMA & Alpaca Large Model – News Fast Delivery

Chinese-LLaMA-Alpaca contains the Chinese LLaMA model and the instruction-fine-tuned Chinese Alpaca model. Building on the original LLaMA, these models expand the Chinese vocabulary and perform secondary pre-training on Chinese data, which further improves basic Chinese semantic understanding. On top of that, the Chinese Alpaca model further uses Chinese
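
As a rough illustration of the vocabulary-expansion step described above (a conceptual sketch only, not the project's actual merge script; the checkpoint path and sample tokens are placeholders), the idea in Hugging Face transformers terms looks roughly like this:

# Conceptual sketch: add Chinese tokens to a LLaMA tokenizer and resize the
# embedding table so the new ids get trainable vectors, which are then
# learned during secondary pre-training on Chinese data.
# "path/to/llama-7b" and the token list are placeholders for illustration.
from transformers import LlamaTokenizer, LlamaForCausalLM

base = "path/to/llama-7b"  # placeholder: local path or Hub id of a LLaMA checkpoint
tokenizer = LlamaTokenizer.from_pretrained(base)
model = LlamaForCausalLM.from_pretrained(base)

# Real projects merge a full Chinese SentencePiece vocabulary rather than a
# hand-written token list.
new_tokens = ["数据库", "自然语言", "预训练"]
num_added = tokenizer.add_tokens(new_tokens)

model.resize_token_embeddings(len(tokenizer))
print(f"added {num_added} tokens, new vocab size: {len(tokenizer)}")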

OpenLLaMA Homepage, Documentation and Downloads – An Open Source Replication of the LLaMA Large Language Model – News Fast Delivery

OpenLLaMA is an open source reproduction of Meta AI’s LLaMA large language model, released under a permissive license. The repository contains a public preview of the 7B OpenLLaMA model trained on 200 billion tokens, and provides PyTorch and JAX weights for the pretrained model, as well as evaluation results and comparisons against the original LLaMA model.
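
For context, the released PyTorch weights can typically be loaded with the Hugging Face transformers LLaMA classes; a minimal sketch, assuming a checkpoint id such as openlm-research/open_llama_7b_preview_200bt (the exact id should be checked against the OpenLLaMA repository):

# Hedged sketch: load an OpenLLaMA checkpoint and generate a short completion.
# The model id below is an assumption for illustration only.
import torch
from transformers import LlamaTokenizer, LlamaForCausalLM

model_id = "openlm-research/open_llama_7b_preview_200bt"  # assumed checkpoint id
tokenizer = LlamaTokenizer.from_pretrained(model_id)
model = LlamaForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

inputs = tokenizer("Q: What is a large language model?\nA:", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))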

MLC LLM Homepage, Documentation and Downloads – Local Large Language Model – News Fast Delivery

MLC LLM is a universal solution that allows any language model to be deployed locally on a diverse set of hardware backends and in native applications. In addition, MLC LLM provides an efficient framework that lets users further optimize model performance for their own needs. MLC LLM is designed to enable everyone to develop, optimize and deploy AI

SM4Utils, an encryption and decryption tool for the national standard SM4 algorithm

SM4Utils is a symmetric encryption and decryption tool that encapsulates the Chinese national standard SM4 algorithm. It supports both JavaScript and Java; this is the Java version. Import it via Maven:

<dependency>
    <groupId>icu.xuyijie</groupId>
    <artifactId>SM4Utils</artifactId>
    <version>1.4.1</version>
</dependency>

Usage:

/**
 * ECB encryption mode
 */
// Do not use a custom secretKey; this is generally used when the back end encrypts data for itself
// If it is front-end encryption…

Base64Util Homepage, Documentation and Download – Base64 Codec Tool – News Fast Delivery

Base64Util can save a file generated from Base64 decoding to a specified path, or convert a file into its Base64 encoding. Supported language: Java.

<!-- https://mvnrepository.com/artifact/icu.xuyijie/Base64Utils -->
<dependency>
    <groupId>icu.xuyijie</groupId>
    <artifactId>Base64Utils</artifactId>
    <version>1.2.3</version>
</dependency>

// Encode a file into Base64; pass either the full file path or a File object
String s = Base64Util.transferToBase64("D:/下载/Screenshot_20221008-090627.png");
File file = new File(filePath);
s = Base64Util.transferToBase64(file);
System.out.println(s);

// Convert Base64 back into a file saved at the specified location;
// pass the full file path, or the save location and file name separately
String s1 = Base64Util.generateFile(s, "D:/下载/aaa.png");

WizardLM Homepage, Documentation and Downloads – Fine-tuning Large Language Model Based on LLaMA – News Fast Delivery

WizardLM is a fine-tuned 7B LLaMA model. It is fine-tuned for instruction-following dialogue on a large number of instructions of varying difficulty. The novelty of this model is its use of an LLM to automatically generate training data. The WizardLM model uses a new method called Evol-Instruct (a new method to improve the ability of LLMs by
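
To make the Evol-Instruct idea concrete, here is a rough conceptual sketch of evolving instructions with an LLM (the prompt wording and the ask_llm helper are hypothetical placeholders, not WizardLM's actual prompts or code):

# Conceptual sketch: repeatedly ask an LLM to rewrite a seed instruction into
# a more complex one, collecting the results as candidate training data.
# ask_llm is a hypothetical helper that sends a prompt to some LLM endpoint
# and returns its text response.
def evolve_instruction(seed, rounds, ask_llm):
    evolved = [seed]
    current = seed
    for _ in range(rounds):
        prompt = (
            "Rewrite the following instruction to be more complex, "
            "while keeping it answerable:\n" + current
        )
        current = ask_llm(prompt)
        evolved.append(current)
    return evolved

# The evolved instructions, paired with LLM-generated answers, would then be
# used to fine-tune the base LLaMA model.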

SantaCoder Homepage, Documentation and Downloads – Lightweight AI Programming Model – News Fast Delivery

SantaCoder is a language model with 1.1 billion parameters that can be used for code generation and completion suggestions in several programming languages such as Python, Java, and JavaScript. According to the official information, SantaCoder was trained on The Stack (v1.1) dataset. Although SantaCoder is relatively small in size, with only 1.1
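
A model of this size is straightforward to try locally; a minimal sketch using Hugging Face transformers, assuming the checkpoint is published on the Hub as bigcode/santacoder (the id and the trust_remote_code requirement are assumptions to verify against the model card):

# Hedged sketch: generate a code completion with SantaCoder via transformers.
# The checkpoint id is an assumption; trust_remote_code=True is needed if the
# model ships custom modeling code.
from transformers import AutoTokenizer, AutoModelForCausalLM

checkpoint = "bigcode/santacoder"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, trust_remote_code=True)

inputs = tokenizer("def fibonacci(n):", return_tensors="pt")
outputs = model.generate(inputs.input_ids, max_new_tokens=48)
print(tokenizer.decode(outputs[0]))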

Enlightenment Homepage, Documentation and Downloads – Bilingual Multimodal Large Language Model – News Fast Delivery

“Enlightenment” is a bilingual multimodal pre-training model with a scale of 1.75 trillion parameters. The project currently has 7 open source model results, and the model parameter files must be requested for download on the Enlightenment platform. Text-to-image model CogView: CogView has 4 billion parameters. The model can generate images from text, and after fine-tuning, it

Pengcheng·Pangu α Homepage, Documentation and Download – Chinese Pre-trained Language Model – News Fast Delivery

Pengcheng Pangu α is the industry’s first 200-billion-parameter pre-trained language model with Chinese as its core. Two versions are currently open source: Pengcheng Pangu α and the Pengcheng Pangu α enhanced version, with support for both NPU and GPU. It supports a rich range of application scenarios and performs outstandingly in text generation fields such as knowledge
