Running Google’s new Gemini model locally

Soham Desai
4 min read · Jan 2, 2024


Picture this: a world-class AI model built by Google, crafting creative masterpieces right at your fingertips. That’s right: Google has just released the first public preview of the new Gemini Pro model from its AI lineup, and with a simple setup you too can run it locally.

The model comes in two variants: gemini-pro, a text-to-text model that specializes in versatile text and code generation, and gemini-pro-vision, an image-to-text model that also accepts images in the prompt, generating related content and insights from the uploaded image.

Google AI Website

Now let’s jump right into the setup and implementation!

First, we need an API key for the model, so head over to Google’s AI page and click Get API key in Google AI Studio. Get your key by selecting Create your API key in a new project, and you are all set for the implementation.

API key dashboard

Now head over to your code editor (preferably VS Code 🌝) and create a Python virtual environment in an empty folder.

python -m venv venv
source venv/bin/activate  # on Windows: venv\Scripts\activate

Next, we make a requirements.txt file listing the packages to install.

google-generativeai
streamlit
python-dotenv

Here, google-generativeai is Google’s package for accessing the Gemini models, streamlit is a fast and easy way to build web applications, and python-dotenv loads environment variables from a .env file. Now we install these packages:

pip install -r requirements.txt

The last step is to paste your API key into a .env file in the project root:

GOOGLE_API_KEY="<your-API-key>"
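Under the hood, load_dotenv simply parses KEY=value lines from the .env file and exports them into the process environment. A stdlib-only sketch of the idea (load_env_file is a simplified, hypothetical stand-in for python-dotenv’s real parser, which also handles comments, quoting rules, and more):

```python
import os
import tempfile

# Hypothetical helper mimicking what python-dotenv's load_dotenv does:
# parse KEY="value" lines and export them into os.environ.
def load_env_file(path):
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ[key.strip()] = value.strip().strip('"')

# Write a throwaway .env file and load it.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write('GOOGLE_API_KEY="not-a-real-key"\n')
    env_path = f.name

load_env_file(env_path)
print(os.environ["GOOGLE_API_KEY"])  # not-a-real-key
```

This is why the key never has to appear in your source code: the SDK reads it from the environment at runtime.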

Now that the project is set up, let’s begin the fun part: coding!

There are actually two methods of implementation; let’s explore both.

Method 1: Python Notebook

Here we run the code cell by cell in a Python notebook.

1. Import the packages

import textwrap
import os

import google.generativeai as genai
from dotenv import load_dotenv
from IPython.display import display, Markdown

2. Define a helper to render the response text as Markdown

def to_markdown(text):
    text = text.replace('•', ' *')
    return Markdown(textwrap.indent(text, '> ', predicate=lambda _: True))
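The blockquote effect comes from textwrap.indent, which prefixes each line of the text; the predicate makes it prefix blank lines too (by default, whitespace-only lines are skipped). A quick stdlib-only illustration:

```python
import textwrap

# Prefix every line, including the empty one, with "> ".
quoted = textwrap.indent("Hello\n\nWorld", "> ", predicate=lambda _: True)
print(quoted)
# > Hello
# >
# > World
```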

3. Load your API key from .env

load_dotenv()
GOOGLE_API_KEY = os.getenv('GOOGLE_API_KEY')
genai.configure(api_key=GOOGLE_API_KEY)

4. List the models that support content generation

for m in genai.list_models():
    if 'generateContent' in m.supported_generation_methods:
        print(m.name)

Output: a list of supported model names, including models/gemini-pro and models/gemini-pro-vision.

5. To use the text model

# Selecting text model 
model = genai.GenerativeModel('gemini-pro')

6. To get the response from the model by providing a prompt

# Write any prompt of your choice
response = model.generate_content("What is football?")

to_markdown(response.text)

Output :

Output text

7. To stream the response chunk by chunk

Pass stream=True so the SDK returns the response incrementally as it is generated:

response = model.generate_content("What is football?", stream=True)

for chunk in response:
    print(chunk.text)
    print("_"*80)

Output :

response line-by-line

This is one way to run the Gemini AI locally on your machine.
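The gemini-pro-vision variant mentioned earlier works the same way, except that the prompt can mix text and images. A minimal sketch, assuming the Pillow package is installed, a local image named photo.jpg exists (a hypothetical filename), and your API key is already configured as above (this calls the live API, so it will not run offline):

```python
import google.generativeai as genai
import PIL.Image

# Assumes genai.configure(api_key=...) has already been called.
model = genai.GenerativeModel('gemini-pro-vision')

# Open a local image; 'photo.jpg' is a placeholder path.
img = PIL.Image.open('photo.jpg')

# Pass a list mixing text and images to the multimodal model.
response = model.generate_content(["Describe this image in one sentence.", img])
print(response.text)
```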

Method 2: Using Streamlit

First, create a Python file named gemini.py:

from dotenv import load_dotenv
import os
import google.generativeai as genai
import streamlit as st

load_dotenv()
GOOGLE_API_KEY = os.getenv('GOOGLE_API_KEY')
genai.configure(api_key=GOOGLE_API_KEY)

model = genai.GenerativeModel('gemini-pro')

def generate_content(prompt):
    response = model.generate_content(prompt)
    return response.text

st.title('Gemini AI Text Generator')
prompt = st.text_input('Enter a prompt:')
if st.button('Generate'):
    response = generate_content(prompt)
    st.write(response)

Here we build a Streamlit application in which we read the API key from the .env file and load it into the GOOGLE_API_KEY variable.

Then we define a simple function, generate_content, that takes a prompt as an argument, passes it to the model, and returns the response text.

Finally, we add a text input in Streamlit with st.text_input and a Generate button that calls the function above; when pressed, it writes the response text below the text field.

Launch the app from the project folder:

streamlit run gemini.py

By default, Streamlit serves it at http://localhost:8501. Here is the output of the Streamlit application.

Streamlit application in localhost

That’s it for now. Thanks for reading this article; I hope you liked it!

You can connect with me on Linkedin and Github!

Cheers!! 🎉


Written by Soham Desai

IT Engineer from Xavier Institute of Engineering | Flutter App Dev | AI dev | Web 3 Enthusiast
