
Team Project for CS454(2024 Fall KAIST) - Optimizing LLM prompts for Rust code generation

Jin00x/CodePrompt


EvoPrompt for Code: Extending EvoPrompt for Rust Code Generation

EvoPrompt paper: Connecting Large Language Models with Evolutionary Algorithms Yields Powerful Prompt Optimizers

Abstract

EvoPrompt is a framework for optimizing discrete prompts by combining the language processing power of large language models (LLMs) with the efficient search capabilities of evolutionary algorithms (EAs). By starting with an initial population of prompts, EvoPrompt iteratively improves them using LLM-based generation and EA-inspired selection, achieving state-of-the-art performance on tasks across 31 datasets. This approach automates prompt creation and significantly outperforms human-engineered and existing automatic methods, showcasing the potential of integrating LLMs with conventional algorithms for better optimization.

Motivation

One of the most productive uses of generative models is code generation. However, generating high-quality code that satisfies all requirements is a challenging task. The EvoPrompt framework does not address this aspect of LLM generation, so our project extends EvoPrompt to find the best prompt for Rust code generation. The project uses Rust implementations of a number of different functionalities as the source code and evolves a population of prompts against them.

Implementation Details

This project implements a genetic algorithm (GA) to find the best prompt for generating the Rust code for a singly linked list. The GA evolves candidate prompts over multiple generations, optimizing them based on fitness values.
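At a high level, the loop in GA/ga.py follows the standard GA skeleton sketched below. This is a minimal illustration with placeholder operators; in the actual project, crossover and mutation are performed via LLM calls (GA/llm_api.py), and fitness is derived from compiling and testing the generated Rust code.

```python
import random

def evolve(population, fitness, crossover, mutate, generations=10, pop_size=10):
    """Minimal GA loop sketch: evaluate, select, recombine, mutate."""
    for _ in range(generations):
        # Rank individuals by fitness (higher is better).
        scored = sorted(population, key=fitness, reverse=True)
        survivors = scored[: pop_size // 2]  # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            # Pick two distinct parents and produce a mutated child.
            p1, p2 = random.sample(survivors, 2)
            children.append(mutate(crossover(p1, p2)))
        population = survivors + children
    return max(population, key=fitness)
```

With prompt strings as individuals, the fitness function would send each prompt to the LLM, compile the resulting Rust code, and score it by the compiler errors and test results.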

Quick Start

Setup Instructions

  1. Install the required dependencies:

    pip install -r requirements.txt
  2. Add your OpenAI API key to a .env file in the root directory:

    LLM_API_KEY=<your_key_here>
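For reference, the key can then be read back in Python. The snippet below is a stdlib-only sketch of a .env loader; the project itself likely uses a package from requirements.txt such as python-dotenv, so treat this as an illustration only.

```python
import os

def load_env(path=".env"):
    """Tiny .env loader (stdlib only): puts KEY=value lines into os.environ.
    A stand-in for python-dotenv's load_dotenv(); skips comments and blanks."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())

# Usage: load_env(); api_key = os.environ["LLM_API_KEY"]
```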

Running the Genetic Algorithm

  1. Navigate to the directory:

    cd GA
  2. Run the genetic algorithm:

    python ga.py --file <file_name>

Code Details

  • GA/ga.py: Contains the main implementation of the genetic algorithm, including functions for selection, crossover, mutation, and fitness evaluation.
  • GA/error_message_parser.py: Parses the output of Rust compiler errors and test results.
  • GA/llm_api.py: Interfaces with the OpenAI API to generate and mutate code prompts.
  • initial_prompts/init_prompts.json: Contains the initial prompts used to generate the initial population of solutions.
  • rust_examples/src/{project}/{project}.rs: Contains the source code of the project.
  • rust_examples/src/{project}/test_{project}.rs: Contains the test cases for the project.
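As an illustration of the parsing step: rustc tags each diagnostic with a code such as error[E0308], and cargo test prints a summary line like "test result: FAILED. 3 passed; 2 failed; ...". A simplified stand-in for GA/error_message_parser.py (the repo's actual parser may differ) could extract both with regular expressions:

```python
import re

def extract_error_codes(compiler_output: str) -> list[str]:
    """Pull rustc error codes (e.g. E0308) out of cargo/rustc output."""
    return re.findall(r"error\[(E\d{4})\]", compiler_output)

def count_test_failures(test_output: str) -> int:
    """Read the failure count from cargo test's summary line,
    e.g. 'test result: FAILED. 3 passed; 2 failed; 0 ignored'."""
    m = re.search(r"(\d+) passed; (\d+) failed", test_output)
    return int(m.group(2)) if m else 0
```

Counts and codes like these give the GA a graded fitness signal rather than a binary pass/fail.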

Testing the source Rust code

  1. Navigate to the directory:

    cd ../linked_list
  2. Run the tests:

    cargo test

Additional Information

  • GA/test_capture.py: Captures and parses the output of the Rust tests to extract error codes.
  • GA/test.py: Runs the Rust tests and prints the output.
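The job of GA/test.py can be sketched as a small subprocess wrapper around cargo test. The command parameter below is a generalization added here for illustration; the repo's version may hard-code the cargo invocation.

```python
import subprocess

def run_tests(project_dir: str, command=("cargo", "test")) -> tuple[bool, str]:
    """Run the test command in project_dir and return (passed, combined output).
    A sketch of what GA/test.py might do; the actual script may differ."""
    result = subprocess.run(
        list(command),
        cwd=project_dir,
        capture_output=True,  # collect stdout/stderr instead of printing
        text=True,
    )
    return result.returncode == 0, result.stdout + result.stderr
```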
