
PYTHON AI Coding Labs

Below is a simple example of how you can use Python to send a prompt to the OpenAI API and display the response.
How ChatGPT works:
The OpenAI language model is hosted on a server in the cloud.
ChatGPT is a website: the site sends API calls to that hosted language model on your behalf. Under the hood, the model's weights are stored as PyTorch tensor files; in Week 3 we will build a small model of our own using the Python PyTorch library.
The purpose of this lab is to introduce the understanding, tools, and work practices you will apply in the Assignment and Project for this course.

Tools:
To work locally, we need a Python IDE:
We will use Visual Studio Code together with the Anaconda Python distribution.

To reach the high-performance compute we need for AI model building, we will use cloud services:
1. Google Colab: Jupyter notebooks in the cloud.
2. Hugging Face Spaces: hosted machine-learning apps and demos.

Step 1:
Let's go and access Google Colab.

This example uses the official openai Python library to send a completion request to the OpenAI API.
Ensure that you have installed the openai library (pip install openai) before running this code.
Please replace "your_openai_api_key_here" with your actual OpenAI API key.

from openai import OpenAI

# Replace with your actual OpenAI API key.
# (This client style requires openai >= 1.0; the older
#  openai.Completion.create call no longer works there.)
client = OpenAI(api_key="your_openai_api_key_here")

response = client.completions.create(
    model="gpt-3.5-turbo-instruct",  # or another completions model
    prompt="What is the capital of France?",
    max_tokens=50,      # cap the length of the reply
    temperature=0.5,    # lower = more deterministic output
    top_p=1,
    frequency_penalty=0,
    presence_penalty=0,
)

print(response.choices[0].text.strip())
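The same call can also be made with plain HTTP using the requests library, which makes the API mechanics visible. A minimal sketch against OpenAI's public /v1/completions REST endpoint; the build_request helper is our own illustrative name, and you still need a real API key (here read from the OPENAI_API_KEY environment variable) for the final request to succeed:

```python
import os
import requests

API_URL = "https://api.openai.com/v1/completions"  # legacy completions endpoint

def build_request(prompt, api_key, model="gpt-3.5-turbo-instruct"):
    """Assemble the headers and JSON body for one completion call."""
    headers = {
        "Authorization": f"Bearer {api_key}",  # the API key travels in this header
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "prompt": prompt,
        "max_tokens": 50,
        "temperature": 0.5,
    }
    return headers, payload

headers, payload = build_request("What is the capital of France?",
                                 os.environ.get("OPENAI_API_KEY", ""))

# Only send the request if a real key is configured.
if headers["Authorization"] != "Bearer ":
    resp = requests.post(API_URL, headers=headers, json=payload, timeout=30)
    resp.raise_for_status()
    print(resp.json()["choices"][0]["text"].strip())
```

Seeing the raw headers and JSON body clarifies what the openai library does for you behind the scenes.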

Important Notes:

Handle the API key securely; for production-grade applications, never hard-code it in the source.
Use environment variables or configuration files to manage API keys and other sensitive information.
Handle API rate limits and errors appropriately to keep the code robust.
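The notes above can be sketched in code. A minimal, hypothetical pair of helpers, assuming the key lives in an OPENAI_API_KEY environment variable and that exponential backoff is an acceptable way to handle transient errors such as rate limits:

```python
import os
import time

def get_api_key():
    """Read the key from the environment instead of hard-coding it."""
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("Set the OPENAI_API_KEY environment variable first.")
    return key

def call_with_retries(fn, max_attempts=3, base_delay=1.0):
    """Call fn(); on failure, wait and retry with exponential backoff.

    Delays are base_delay, 2*base_delay, 4*base_delay, ... between attempts;
    the last failure is re-raised so the caller still sees the error.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# Usage idea: wrap the API call from the example above, e.g.
#   text = call_with_retries(lambda: ask_model(get_api_key()))
# where ask_model is whatever function sends your prompt.
```

Keeping the retry logic in one small wrapper means every API call in your assignment can share the same error handling.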