Pedantic Journal

Thoughts on AI and other subjects.

AI-powered exam dashboard


After over 20 years spent earning Microsoft certifications, and with my recent success on the GH-900 exam, I started reflecting on just how far I’ve come. That’s when the idea for exam-timeline was born. I wanted a fun, interactive way to visualise my certification journey and to try building a simple AI app that used LLMs from GitHub Models.

The initial prototype came together in under half an hour of vibe-coding, thanks to GitHub Copilot for the coding assist and the Free tier of Azure Static Web Apps for super-quick deployment and tight GitHub integration. From there, I spent a few more hours automating the extraction of exam data and wiring up GitHub Actions - mainly delegating the hard work to the GitHub Coding Agent. A few days later, I added the "AI recommendation" feature, using a few lines of Python. The end result is a project that's both personal and practical, with a workflow that anyone can replicate.

How it works - AI-powered recommendation via GitHub Models

Let's start with the feature I'm personally most excited about - the AI recommendation that suggests the 'next logical exam'.


  • After the transcript is downloaded from Microsoft Learn, the workflow calls a Python script, which inserts the learner's transcript into the LLM's user prompt.
  • The system prompt guides the LLM to make a recommendation for a next logical exam, and to avoid choosing an exam that the learner has already completed.
  • The script uses OpenAI's gpt-4o model, hosted by GitHub Models (on Azure AI).
  • Because this script is called directly from GitHub Actions, auth works seamlessly. I only had to add models: read to the permissions section of the workflow for it to work.
  • To return a consistent response, Structured Outputs are used with the enum type, so the model can only answer with one of the prioritised exams (see the sketch after this list).
  • The script outputs a JSON object containing the recommendation, e.g. {"exam_code":"AZ-305"}, which is then inserted into the button in index.html.
  • This automation uses the GitHub Models quota included in the GitHub Copilot plans (including the free tier).
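
Here's a minimal sketch of that call, assuming the OpenAI Python SDK pointed at the GitHub Models endpoint. The exam list, file name and prompts below are placeholders for illustration, not the project's exact script:

import json
import os

from openai import OpenAI

# Hypothetical list of exams the model is allowed to recommend
PRIORITISED_EXAMS = ["AZ-305", "AZ-400", "AZ-700", "SC-100"]

client = OpenAI(
    base_url="https://models.inference.ai.azure.com",  # GitHub Models OpenAI-compatible endpoint
    api_key=os.environ["GITHUB_TOKEN"],  # available once the workflow grants models: read
)

# Structured Outputs: constrain the answer to a single value from the enum
schema = {
    "name": "exam_recommendation",
    "strict": True,
    "schema": {
        "type": "object",
        "properties": {"exam_code": {"type": "string", "enum": PRIORITISED_EXAMS}},
        "required": ["exam_code"],
        "additionalProperties": False,
    },
}

with open("transcript.csv", encoding="utf-8") as f:  # placeholder file name
    transcript_text = f.read()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "Recommend the next logical exam from the allowed list. "
                                      "Never suggest an exam the learner has already passed."},
        {"role": "user", "content": transcript_text},
    ],
    response_format={"type": "json_schema", "json_schema": schema},
)

recommendation = json.loads(response.choices[0].message.content)
print(recommendation)  # e.g. {"exam_code": "AZ-305"}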

How it works - Data extraction from Microsoft Learn

To extract the exam information, the project uses a Python script to download the Microsoft certification transcript, based on the Transcript sharing code.

Disclaimer: The use of the Microsoft Learn API in this way is not officially supported or documented, and while suitable for a simple hobby project, is not appropriate for a production application. Future API availability is not guaranteed. For commercial integrations, please contact your Microsoft representative.

You can run this script independently if you just want a quick export for your own records or to feed into another tool. But if you're feeling ambitious, you can clone the entire repo, customise it, and deploy your own version in minutes. The daily automation fetches your latest transcript and stores it in the repo as a .csv file, which feeds a simple Plotly-powered (JS) dashboard. Here's how it gets the transcript from Microsoft Learn:

import requests

API_ENDPOINT_TEMPLATE = "https://learn.microsoft.com/api/profiles/transcript/share/{share_id}?locale={locale}"

def fetch_transcript(share_id: str, locale: str = "en-us") -> dict:
    """Download the shared Microsoft Learn transcript as JSON."""
    url = API_ENDPOINT_TEMPLATE.format(share_id=share_id, locale=locale)
    headers = {
        # Provide a User-Agent to avoid potential filtering of generic requests
        "User-Agent": "Mozilla/5.0 (compatible; MSFTTranscriptFetcher/1.0)"
    }
    response = requests.get(url, headers=headers)
    response.raise_for_status()
    return response.json()

Alternatively, using cURL:

curl -f -H "User-Agent: Mozilla/5.0 (compatible; MSFTTranscriptFetcher/1.0)" "https://learn.microsoft.com/api/profiles/transcript/share/${share_id}?locale=${locale}"
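
From there, the daily automation flattens the JSON payload into the .csv the dashboard reads. Here's a minimal sketch of what that step could look like; the JSON keys below are assumptions for illustration, not the documented transcript schema:

import csv

def write_exams_csv(transcript: dict, path: str = "transcript.csv") -> None:
    """Flatten the transcript JSON into a small CSV for the dashboard.
    NOTE: the keys below are hypothetical - inspect the real payload first."""
    exams = transcript.get("examData", {}).get("passedExams", [])  # hypothetical keys
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["exam_code", "title", "date_passed"])
        for exam in exams:
            writer.writerow([
                exam.get("examNumber"),  # hypothetical field
                exam.get("examTitle"),   # hypothetical field
                exam.get("datePassed"),  # hypothetical field
            ])

# Example usage, combining with the fetch_transcript function above
write_exams_csv(fetch_transcript(share_id="your-share-code"))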

If you want to try this yourself, just remember: your Microsoft transcript may contain sensitive personal information, like your name and email address. Thankfully, Microsoft Learn lets you adjust this before sharing anything publicly.


Spending time on this project really helped me deepen my knowledge of GitHub Actions, and especially how they can be integrated with GitHub Models for "Continuous AI". It also made me consider how, in future, we might AI-enable data like this using broadly adopted standards like MCP. Big thanks to a couple of colleagues for their input and advice on this project: fellow PSA Bojan Vrhovnik, and Allison Waldmann from the Microsoft Learn platform team.

Finally, if you're looking to experiment with Python, GitHub Actions, GitHub Models, and Azure Static Web Apps, I hope my little exam-timeline project inspires you to take pride in your learning journey. Contributions welcome!