
Adobe Develops SlimLM That Can Process Documents Locally on Devices Without Internet Connectivity

Researchers at Adobe have released a paper describing a new artificial intelligence (AI) model that can analyze documents directly on a device. The paper, published last week, details how the team studied both large language models (LLMs) and small language models (SLMs) to find ways to shrink a model while preserving processing capability and inference speed. The result of these experiments is an AI model called SlimLM, which can run on a smartphone and handle document processing.

Adobe researchers have successfully developed SlimLM.


An important application of generative AI is AI-powered document processing, which lets a chatbot answer user questions about a document's content. Several companies, including Adobe, have shipped tools offering this functionality. However, a drawback shared by all of these tools is that the AI processing happens in the cloud. Processing data on remote servers raises data privacy concerns and poses a potential risk for documents containing sensitive information.

The main worries are that the company providing the service might use the data to train its AI, or that a data breach could leak sensitive information. In response, Adobe researchers have detailed a new AI model that performs document processing entirely on the device, in a paper published on the preprint server arXiv.

Named SlimLM, the smallest version of the model has just 125 million parameters, small enough to be integrated into a smartphone. The researchers say it runs entirely locally, without an Internet connection, so users can work with even the most sensitive documents knowing that the data never leaves the device.
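To illustrate the general idea of on-device document question answering, the Python sketch below runs a small causal language model fully locally with the Hugging Face Transformers library. It is not Adobe's implementation: the paper does not describe SlimLM's API or release artifacts, so the checkpoint name "distilgpt2" and the helper function are stand-in assumptions used purely for illustration.

```python
# Illustrative sketch only: shows the pattern of answering questions about a
# document with a small language model running entirely on the local machine.
# "distilgpt2" is a stand-in checkpoint; SlimLM itself is not cited as available.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "distilgpt2"  # assumption: any small causal LM checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

def answer_about_document(document: str, question: str, max_new_tokens: int = 64) -> str:
    """Answer a question about a document locally; no text leaves the device."""
    prompt = f"Document:\n{document}\n\nQuestion: {question}\nAnswer:"
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=1024)
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=False,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Return only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)

if __name__ == "__main__":
    doc = "The quarterly report shows revenue of 4.2 million dollars, up 8 percent year over year."
    print(answer_about_document(doc, "What was the revenue?"))
```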

The researchers state in their paper that they ran multiple experiments on a Samsung Galaxy S24 to find the best balance between parameter count, inference speed, and processing capability. After settling on a configuration, the team pre-trained the model on the SlimPajama-627B dataset and then fine-tuned it using DocAssist to specialize it for document-assistance tasks.
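As a rough picture of what such a fine-tuning stage looks like in practice, the sketch below adapts a small causal language model to document-assistance examples with the Hugging Face Trainer. It is not Adobe's pipeline, and the two toy examples stand in for DocAssist, which the article does not describe in detail; the checkpoint name and hyperparameters are assumptions.

```python
# Minimal fine-tuning sketch, not Adobe's actual training recipe. A handful of
# hypothetical document-assistance examples stand in for the DocAssist data.
from transformers import AutoModelForCausalLM, AutoTokenizer, Trainer, TrainingArguments
from datasets import Dataset

MODEL_NAME = "distilgpt2"  # assumption: any small causal LM checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 family has no pad token by default
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

# Hypothetical document-assistance examples (document + task + target response).
examples = [
    {"text": "Document: Invoice total is $120.\nTask: Summarize.\nResponse: The invoice totals $120."},
    {"text": "Document: Meeting moved to Friday.\nTask: Extract the new date.\nResponse: Friday."},
]

def tokenize(batch):
    enc = tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)
    enc["labels"] = enc["input_ids"].copy()  # causal LM objective: predict the input itself
    return enc

dataset = Dataset.from_list(examples).map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="doc-assist-finetune",
        per_device_train_batch_size=2,
        num_train_epochs=1,
        logging_steps=1,
    ),
    train_dataset=dataset,
)
trainer.train()
```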

It is worth noting that arXiv is a preprint repository that does not require peer review before publication, so the claims in the paper have not yet been independently verified. If they hold up, however, the model could eventually be integrated into Adobe's platforms.
