🚀 AI News: Trending AI Research + Cool Github Repos + Trending AI Tools.. (July 11, 2023 Edition)

This newsletter brings you AI research news that is much more technical than most resources but still digestible and applicable.

🔥 Trending AI Research: Let’s learn something new from the trending papers.

💻 Some Cool Github Repos: Take a deep dive into the world of advanced AI with these trending Github repos

🏃 Let’s Practice Some Generative AI: Let’s have some fun in the lab by trying out some prompts and AI tools.

🛎️ Trending Tools: Check out some cool AI tools picked by our editorial team.

Read Time: 5 Minutes

🔥Trending AI Research

1️⃣ DeepOnto: A Python Package for Ontology Engineering with Deep Learning [Paper] [Github link]

This paper presents DeepOnto, a Python package designed to facilitate ontology engineering by integrating deep learning frameworks like PyTorch and TensorFlow with widely used ontology APIs such as the OWL API and Jena. Traditionally, there has been a compatibility issue, since these frameworks are primarily Python-based while the APIs are Java-based. DeepOnto includes a core ontology processing module that is built on the OWL API and encapsulates its fundamental features in a Pythonic way, extending its functionality to include reasoning, verbalization, normalization, projection, and more. DeepOnto also provides tools, resources, and algorithms that support various ontology engineering tasks, such as alignment and completion, leveraging deep learning methodologies, especially pre-trained language models (LMs).
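
To give a feel for how the package is used, here is a minimal, illustrative sketch of loading an ontology through DeepOnto's Python wrapper. The module path and attribute names follow the project's documentation but should be treated as assumptions and verified against the repo.

```python
# Illustrative sketch only: module/attribute names are assumptions based on
# DeepOnto's docs and may differ between versions.
from deeponto.onto import Ontology

# Load a local OWL file (hypothetical path) through the Pythonic wrapper
# around the Java OWL API.
onto = Ontology("data/example.owl")

# Inspect entities exposed by the core ontology processing module.
print(len(onto.owl_classes))            # number of named classes (assumed attribute)
print(len(onto.owl_object_properties))  # number of object properties (assumed attribute)
```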

2️⃣ Meet LongLLaMA: A Large Language Model Capable of Handling Long Contexts of 256k Tokens [Paper] [Github link] [Summary Article]

This research paper addresses the issue of large language models being constrained by their limited effective context length and presents a solution to this challenge. The authors propose a method called the Focused Transformer (FoT), which uses an attention layer with access to an external memory composed of (key, value) pairs, extending the model's effective context length. However, they note that as the number of documents increases, the model tends to focus more on irrelevant keys, a problem they term the "distraction issue." To solve this, they employ a training process inspired by contrastive learning, which helps the model distinguish between keys linked to different semantic values. FoT has been used to fine-tune pre-existing large-scale models, specifically the 3B and 7B OpenLLaMA checkpoints, resulting in new models named LongLLaMA. These models show improved performance on tasks requiring a long context and can handle a 256k context length for passkey retrieval.
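
As a quick illustration, the sketch below loads one of the released LongLLaMA checkpoints through Hugging Face transformers. The model id and the trust_remote_code flag are assumptions based on the project's public release, so check the repo's README for the exact instructions.

```python
# Minimal sketch, not official usage: the model id below is an assumption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "syzymon/long_llama_3b"  # assumed Hugging Face Hub id for the 3B checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float32,
    trust_remote_code=True,  # the FoT memory layers ship as custom modeling code
)

prompt = "My passkey is 71432. Remember it.\nWhat is my passkey?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```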

3️⃣ DreamIdentity: Improved Editability for Efficient Face-identity Preserved Image Generation [Paper] [Github link]

This paper introduces a new, efficient method for preserving facial identities in images synthesized by large-scale pre-trained text-to-image models. Existing methods address this challenging task either through time-consuming optimization for each face identity or by learning an efficient encoder, which may compromise the model's editability. The authors propose an optimization-free method that retains the model's editability.

To achieve this, the research group from China introduced a novel face-identity encoder (DreamIdentity) designed to learn accurate representations of human faces. It uses multi-scale face features and a multi-embedding projector to generate pseudo words in the text embedding space. In addition, the research team suggests a self-augmented editability learning technique to enhance the model's ability to edit images. This is done by generating paired images of faces and edited faces using celebrity names, with the intention of transferring this capability to unseen faces.
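
To make the idea of a multi-embedding projector concrete, here is a conceptual PyTorch sketch (my own simplification, not the authors' code): pooled multi-scale face features are projected and fused into a small number of pseudo-word embeddings that live in the text encoder's embedding space. All dimensions and layer choices below are illustrative assumptions.

```python
# Conceptual sketch only; not DreamIdentity's actual architecture.
import torch
import torch.nn as nn

class MultiEmbeddingProjector(nn.Module):
    def __init__(self, feature_dims=(256, 512, 1024), text_dim=768, num_words=2):
        super().__init__()
        # Project each feature scale to the text-embedding width.
        self.scale_proj = nn.ModuleList([nn.Linear(d, text_dim) for d in feature_dims])
        # Fuse the scales and emit `num_words` pseudo-word embeddings.
        self.fuse = nn.Linear(text_dim * len(feature_dims), text_dim * num_words)
        self.num_words, self.text_dim = num_words, text_dim

    def forward(self, multi_scale_feats):
        # multi_scale_feats: list of (batch, dim_i) pooled face features.
        projected = [proj(f) for proj, f in zip(self.scale_proj, multi_scale_feats)]
        fused = self.fuse(torch.cat(projected, dim=-1))
        # Reshape to (batch, num_words, text_dim): these act like extra tokens
        # ("pseudo words") injected into the prompt's text embeddings.
        return fused.view(-1, self.num_words, self.text_dim)

feats = [torch.randn(1, d) for d in (256, 512, 1024)]
pseudo_words = MultiEmbeddingProjector()(feats)
print(pseudo_words.shape)  # torch.Size([1, 2, 768])
```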

4️⃣ Teaching Arithmetic to Small Transformers [Paper]

This study explores how small transformer models can efficiently learn arithmetic operations, such as addition, multiplication, and square root, through the next-token prediction objective, without any explicit pretraining. The researchers find that traditional training data is not ideal for arithmetic learning and that making simple formatting adjustments can significantly enhance accuracy, leading to pronounced phase transitions based on the scale of training data. They also draw connections to low-rank matrix completion. The study then uses chain-of-thought style data, including intermediate results, which dramatically improves accuracy, reduces sample complexity, and speeds up convergence, even without pretraining. Furthermore, the research considers the interplay between arithmetic and text data during training and examines the effects of few-shot prompting, pretraining, and model scale, along with the challenges of length generalization. The findings underscore the importance of high-quality, instructive data that aligns with the next-word prediction objective for quickly developing arithmetic capabilities.
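
As a rough illustration of why formatting matters for a next-token predictor, the sketch below (my own toy example, not the paper's released data) generates the same addition problem in a plain format, a reversed-answer format, and a simplified chain-of-thought format that spells out per-digit sums and carries.

```python
# Toy illustration of arithmetic data formats; not the paper's code.
import random

def plain_format(a: int, b: int) -> str:
    return f"{a}+{b}={a + b}"

def reversed_format(a: int, b: int) -> str:
    # Writing the answer least-significant digit first lets a left-to-right
    # next-token predictor emit each digit after it has seen the needed carry.
    return f"{a}+{b}={str(a + b)[::-1]}"

def cot_format(a: int, b: int) -> str:
    # Spell out digit-by-digit partial sums and carries (simplified chain of thought).
    da, db = str(a)[::-1], str(b)[::-1]
    carry, steps, digits = 0, [], []
    for i in range(max(len(da), len(db))):
        x = int(da[i]) if i < len(da) else 0
        y = int(db[i]) if i < len(db) else 0
        carry, digit = divmod(x + y + carry, 10)
        digits.append(str(digit))
        steps.append(f"{x}+{y} -> digit {digit}, carry {carry}")
    if carry:
        digits.append(str(carry))
    answer = "".join(reversed(digits))
    return f"{a}+{b}: " + "; ".join(steps) + f"; answer {answer}"

a, b = random.randint(10, 99), random.randint(10, 99)
print(plain_format(a, b))     # e.g. 47+58=105
print(reversed_format(a, b))  # e.g. 47+58=501
print(cot_format(a, b))       # per-digit steps ending in the answer
```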

💻 Github Repos

➡️ paul-gauthier / aider: aider is a command-line chat tool that allows you to write and edit code with OpenAI's GPT models.

➡️ imoneoi / openchat: OpenChat is a series of open-source language models based on supervised fine-tuning (SFT).

➡️ h2oai / h2ogpt: h2oGPT is a large language model (LLM) fine-tuning framework and chatbot UI with document(s) question-answer capabilities.

➡️ ramonvc / freegpt-webui: This project features a WebUI utilizing the G4F API. Experience the power of ChatGPT with a user-friendly interface and enhanced jailbreaks, completely free.

🏃Let’s Practice Some Generative AI

How To Use Code Interpreter In ChatGPT For Data Analysis (A Step-by-Step Guide)

Let's learn how to use Code Interpreter in ChatGPT for data analysis.

Step 1: Open ChatGPT and log in to your account.

Step 2: Go to 'Settings' and 'Beta features' to activate 'Code Interpreter.' You may need a ChatGPT Plus account.

Step 3: Now upload a dataset by clicking on the '+' button.

We are using the diabetes dataset downloaded from https://www.kaggle.com/datasets/mathchi/diabetes-data-set.

Step 4: Once the dataset is uploaded, you can see the details of the dataset.

Step 5: Now, you can start asking ChatGPT questions as if you were talking to a data analyst.

Step 6: You can even ask it to create graphs.

Step 7: Similarly, you can ask any question to analyze the data in the uploaded dataset (see the sketch after these steps for the kind of code Code Interpreter runs behind the scenes).
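
For reference, here is a rough sketch of the kind of Python that Code Interpreter typically generates for prompts like "summarize this dataset" or "plot glucose by outcome." The column names follow the Kaggle diabetes dataset linked above; the file name is an assumption.

```python
# Rough sketch of Code Interpreter-style analysis; file name is assumed.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("diabetes.csv")  # the dataset uploaded via the '+' button

# Basic profiling: shape, dtypes, and summary statistics.
print(df.shape)
print(df.dtypes)
print(df.describe())

# Example question: "How does glucose differ between diabetic and non-diabetic patients?"
df.boxplot(column="Glucose", by="Outcome")
plt.title("Glucose by diabetes outcome")
plt.suptitle("")
plt.savefig("glucose_by_outcome.png")
```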

🛎️ Trending Tools

Essense.io: An AI tool that synthesizes insights from raw unstructured customer feedback to inform product decision-making.

Framedrop: Framedrop is an AI-powered tool that automatically clips the best moments from your Twitch stream, making it easy to find, edit, and share highlights. Streamers can optimize workflow and engage their audience with this efficient and user-friendly platform.

Notion: Notion is aiming to grow its user base with its advanced AI technology. Its latest feature, Notion AI, is a robust generative AI tool that assists users with tasks like note summarization, identifying action items in meetings, and creating and modifying text.

Supermeme: Supermeme.ai is an AI-powered meme generator that allows users to create memes by simply typing in text.

AdCreative AI: Generate conversion-focused ad creatives and social media post creatives in a matter of seconds using artificial intelligence.

Bardeen: Bardeen is an automation platform that streamlines workflows without coding. Its AI, Magic Box, generates automation based on your input, which can be customized.

Taplio: Transform your LinkedIn presence with Taplio's AI-powered platform. Spend just 10 minutes a day to elevate your personal brand.

Turbocharge your Customer Research with AI!

*This section is presented in partnership with ICL PR

Essense.io is an AI tool that synthesizes insights from raw, unstructured customer feedback to inform product decision-making. It's very easy to get started:

Import your customer feedback from sources like the iOS App Store, Typeform, HubSpot, Intercom, and many more.

Essense’s AI analyzes thousands of pieces of feedback, delivering results in seconds.

Turn unstructured feedback into insights on your customers' sentiments and pain points. You can also chat with your customer feedback to get specific insights!

Essense is like having a data analyst who never sleeps and delivers insights 100x faster.

Customers are the lifeblood of your business. Book a demo to learn more or start a free trial today!

🤝 Partner with us

Get featured in the world’s fastest-growing AI newsletter from Marktechpost.com and AIToolsclub.com.

Monthly Traffic on AITOOLSCLUB.COM: 100,000+

Monthly Traffic on MARKTECHPOST.COM: 2 Million+

Want to partner and share your tool/product with the AI Community? Email us at [email protected]