This is a simple project that uses the open-source Ollama runtime to run a local LLM, demonstrating its ability to help Android developers and publishers generate appropriate responses to user reviews and ratings. By running Ollama locally, the project lets you efficiently analyze and respond to user feedback for your Android app.
Visit my Android DEV site and my apps here.
To get started, clone this repository to your local machine:
git clone https://github.com/vinhDev3006/Android_Reply_LLM.git
cd Android_Reply_LLM
Ensure the following before proceeding:
- An NVIDIA GPU with CUDA support is available (needed to run Ollama in GPU mode).
- You have a Google Play Developer account and an app with a significant number of reviews.
- Docker / Docker Desktop is installed on your machine.
- Python is installed.
To run the Ollama LLM locally, you need to install Docker / Docker Desktop and execute the following commands:
## Pull the ollama image and run the ollama/ollama container in GPU mode.
docker run --rm --gpus=all -d -v ollama_data:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
## Test the llama3.2:latest model
docker exec -it ollama ollama run llama3.2:latest
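Once the container is running, you can also sanity-check it through Ollama's HTTP API on port 11434. The sketch below uses only the Python standard library; the `/api/generate` endpoint and its `model`/`prompt`/`stream` fields come from Ollama's API, while the helper names are just for illustration:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for Ollama's /api/generate."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """POST a prompt to the local Ollama server and return the response text."""
    body = json.dumps(build_generate_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires the Ollama container to be running):
# print(generate("llama3.2:latest", "Say hello in one short sentence."))
```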
You can download the review dataset from your Google Play Console. Navigate to your app's page and export the review reports.
Once downloaded, place the dataset in the data/ folder.
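As a rough illustration of reading such an export (this is not the project's own preprocessing code; the column names "Review Text" and "Star Rating" and the UTF-16 encoding are assumptions about the Play Console CSV format):

```python
import csv

def load_reviews(path: str, encoding: str = "utf-16") -> list[dict]:
    """Read a Play Console review export and keep rows with non-empty review text.

    Play Console exports are typically UTF-16 encoded; the column names used
    below are assumptions about the export format.
    """
    with open(path, newline="", encoding=encoding) as f:
        rows = list(csv.DictReader(f))
    return [r for r in rows if r.get("Review Text", "").strip()]
```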
To install the required Python packages, preferably in a virtual environment, run:
pip install -r requirements.txt
Run the data_preprocess.py script to process the user review data:
python .\data_preprocess.py --input_data .\data\reviews_reviews_dev.com.example_202410.csv --output_data .\data\october_reviews_record.csv
Finally, run the dev_reply.py script to generate appropriate responses using the Ollama model:
python .\dev_reply.py --data .\data\october_reviews_record.csv
The script iterates through all reviews and prints a suggested reply for each in the terminal.
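Conceptually, the reply step boils down to a loop like the one below. This is a hedged sketch rather than dev_reply.py's actual code: the prompt wording, helper names, and column names are assumptions, and it expects the Ollama container from the setup step to be listening on port 11434.

```python
import json
import urllib.request

def build_prompt(review_text: str, rating: str) -> str:
    """Compose an instruction for the model from one review row."""
    return (
        "You are an Android developer replying to a Google Play review.\n"
        f"Rating: {rating}/5\n"
        f"Review: {review_text}\n"
        "Write a short, polite developer reply."
    )

def ollama_generate(prompt: str, model: str = "llama3.2:latest") -> str:
    """Call the local Ollama server's /api/generate endpoint."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires the Ollama container and a processed CSV):
# import csv
# with open("data/october_reviews_record.csv", newline="", encoding="utf-8") as f:
#     for row in csv.DictReader(f):
#         print(ollama_generate(build_prompt(row["Review Text"], row["Star Rating"])))
```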
The current project only outputs developer replies as plain text in the terminal. Future improvements could include integration with the Google Play Developer API to automatically post replies.
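As a pointer for that future work: the Google Play Developer API (androidpublisher v3) exposes a reviews.reply method. The sketch below is hypothetical and not part of this project — it assumes google-api-python-client (not in requirements.txt) and service-account credentials with Play Console access, and it truncates replies to the 350-character limit Google Play enforces on developer replies.

```python
REPLY_CHAR_LIMIT = 350  # Google Play caps developer replies at 350 characters

def make_reply_body(text: str) -> dict:
    """Build the request body for androidpublisher reviews.reply,
    truncating to the reply character limit if needed."""
    return {"replyText": text[:REPLY_CHAR_LIMIT]}

def post_reply(package_name: str, review_id: str, text: str) -> dict:
    """Post a developer reply via the Android Publisher API (v3).

    Assumes google-api-python-client is installed and default credentials
    (e.g., a service account) with Play Console access are configured.
    """
    from googleapiclient.discovery import build  # not in requirements.txt

    service = build("androidpublisher", "v3")
    return (
        service.reviews()
        .reply(packageName=package_name,
               reviewId=review_id,
               body=make_reply_body(text))
        .execute()
    )
```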
Feel free to contribute to this project by opening pull requests or submitting issues. We welcome feedback and suggestions to improve the performance and functionality of this tool.
This project includes a Dockerfile and a docker-compose.yml to facilitate easy setup and deployment. It is recommended to use Docker Desktop.
cd Android_Reply_LLM
docker compose build
docker compose up
For more details on the Docker configuration, please refer to the docker-compose.yml file in this repository.
This project also contains a simple web app built with the streamlit library and the Google Gemini LLM. To use it, you will need a Google Gemini API key, which you can put in the .env file under the name GOOGLE_API_KEY.
To run the web app, execute:
streamlit run .\web_app\web_app.py
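For reference, a minimal shape for such an app could look like the sketch below. This is not the project's actual web_app.py: the streamlit and google-generativeai packages, the "gemini-1.5-flash" model name, and reading GOOGLE_API_KEY from the environment (e.g., after loading .env with python-dotenv) are all assumptions.

```python
import os

def build_reply_prompt(review: str) -> str:
    """Compose the instruction sent to Gemini for one pasted review."""
    return (
        "Write a short, polite developer reply to this Google Play review:\n"
        f"{review}"
    )

def main() -> None:
    # Imported inside main so the helper above stays importable without
    # streamlit / google-generativeai installed.
    import streamlit as st
    import google.generativeai as genai

    genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
    model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model name

    st.title("Review reply generator")
    review = st.text_area("Paste a user review")
    if st.button("Generate reply") and review:
        st.write(model.generate_content(build_reply_prompt(review)).text)

# Entry point when executed via `streamlit run`:
# main()
```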

