This week I was tasked with testing a long list of items that came my way in a huge Excel file. I have access to an API that returns a large JSON response, and I needed some of that data output in a clean format below each Excel cell. After manually pasting cell values from the Excel file into the GET request and copying data back from the response twice, I realized that this was HELL. Luckily, as I'm learning, there is a way to automate almost everything with Python!

Using the script below, I was able to go row by row, plug each value into my API request, and write the results into a column below the original cell, all in about 4 minutes. It took a little longer than normal, but that's because I had to disable the request timeout, since some of the requests took quite a while.

import urllib.request
import json
import openpyxl

# Open the source workbook
source_wb = openpyxl.load_workbook('source_file.xlsx')
source_ws = source_wb.active

# Open the destination workbook
dest_wb = openpyxl.Workbook()
dest_ws = dest_wb.active

# Paste key here
key = ''
# Paste api endpoint here
url = f'https://your-api-here/with-{key}'

# Loop through the column of items
for cell in source_ws['A']:
    # Get the value of the current cell
    item = cell.value

    # Stop at the first empty cell
    if item is None:
        break
    
    print(item)
    
    # Make the API request for this item (adjust the URL shape to match your API)
    request_url = f'{url}/{item}'
    # timeout=None disables the timeout, since some requests run long
    with urllib.request.urlopen(request_url, timeout=None) as response:
        data = json.loads(response.read().decode())

    # Write the data to the destination worksheet
    dest_ws.append([item])
    if 'seg' in data:
        for segment in data['seg']:
            dest_ws.append([' ', segment['name']])
    else:
        dest_ws.append(['No score returned'])

# Save the destination workbook
dest_wb.save('destination_file.xlsx')
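Disabling the timeout works, but it means a single stuck request can hang the whole run. An alternative is a finite timeout with a few retries. Here's a minimal sketch; `fetch_with_retry`, its parameters, and the injectable `opener` are my own names for illustration, not part of urllib:

```python
import time
import urllib.error
import urllib.request

def fetch_with_retry(url, timeout=30, retries=3, delay=2,
                     opener=urllib.request.urlopen):
    """Fetch a URL with a finite timeout, retrying a few times before giving up."""
    last_error = None
    for attempt in range(retries):
        try:
            with opener(url, timeout=timeout) as response:
                return response.read().decode()
        except (urllib.error.URLError, TimeoutError) as exc:
            last_error = exc
            time.sleep(delay * attempt)  # back off a little more each retry
    raise last_error
```

Swapping this in for the bare `urlopen` call means a dead endpoint fails fast after a few attempts instead of hanging the loop indefinitely.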

It was a real lifesaver and has been shared across my organization to expedite these one-off client requests. What else are you using Python for?
