We begin by summoning Gradio — the Pythonic genie of browser-based interactivity. It turns your humble Python function into a GUI wizard without needing to touch HTML or CSS, or to lose hours to Flask boilerplate and existential crises.
Dr. Wolfram: “Now we define our 'human-readable' labels. The machine thinks in numbers (0, 1), but you — being an organic, squishy, language-using organism — prefer something a little more semantic. This is your Rosetta Stone between the machine world and meatspace.”
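A minimal sketch of that Rosetta Stone. Note the index order (0 = "Technical", 1 = "Soft") is an assumption here — it must match whatever label order the model was fine-tuned with:

```python
# Assumed index order -- must match the label order used at fine-tuning time.
labels = ["Technical", "Soft"]  # class 0 -> "Technical", class 1 -> "Soft"

print(labels[0])  # → Technical
```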
This is the core of our model — the grand decider. The digital judge. The little function that stands between a piece of text and its fate as either "Technical" or "Soft".
Here we break the input sentence down into tokens — the subword units BERT understands. If words are ideas, tokens are syllables. We also specify return_tensors="pt" because PyTorch — like a particularly fussy houseguest — always wants its data in tensor form.
Truncation and padding make sure all sequences are the same length — BERT doesn’t do freestyle jazz.
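A sketch of that tokenization step. It's written as a helper that takes the tokenizer as an argument so the call pattern is visible without downloading anything; in the real app you'd build the tokenizer once with `AutoTokenizer.from_pretrained(...)`, and the `max_length` of 128 is an assumption:

```python
def tokenize(text, tokenizer, max_length=128):
    """Turn raw text into fixed-length PyTorch tensors BERT can consume."""
    return tokenizer(
        text,
        return_tensors="pt",  # PyTorch tensors, not lists or NumPy arrays
        truncation=True,      # clip sequences longer than max_length
        padding=True,         # pad shorter ones -- no freestyle jazz
        max_length=max_length,
    )
```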
Here we whisper to PyTorch, “Chill — we’re not training today.” Disabling gradient tracking saves memory and computation. It’s like telling your brain, “Don’t bother learning anything right now, just give me the answer and don’t ask questions.”
This is the big moment: the model takes in our tokenized inputs and returns... tensors! Glorious, deeply meaningful tensors! Specifically, outputs.logits, which are the raw prediction scores before softmax smooths them into probabilities.
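The no_grad context and the `.logits` access can be sketched like this. The `TinyJudge` model below is a hypothetical stand-in for the fine-tuned BERT (it returns canned scores), but the calling pattern mirrors the real thing; assumes PyTorch is installed:

```python
import torch

class FakeOutput:
    """Mimics the output object a transformers model returns."""
    def __init__(self, logits):
        self.logits = logits

class TinyJudge(torch.nn.Module):
    """Hypothetical stand-in for the fine-tuned BERT classifier."""
    def forward(self, input_ids):
        # Canned scores: class 1 ("Soft") wins for any input.
        return FakeOutput(torch.tensor([[0.3, 2.1]]))

model = TinyJudge()
with torch.no_grad():  # inference only: no gradient bookkeeping
    outputs = model(torch.tensor([[101, 102]]))

# softmax smooths raw logits into probabilities that sum to 1
probs = torch.softmax(outputs.logits, dim=-1)
```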
Here we ask: “Which of these logits is the biggest?” Because in machine learning, bigger is better. We use argmax() to find the index of the highest-scoring class, and .item() to extract it from its tensor-shaped cocoon into a plain ol’ Python int.
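The cocoon-extraction in two lines, using made-up logits for illustration:

```python
import torch

logits = torch.tensor([[0.3, 2.1]])             # raw scores for [Technical, Soft]
predicted_class = logits.argmax(dim=-1).item()  # index of the biggest logit, as a plain int

print(predicted_class)  # → 1
```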
This is where machine and human shake hands. The integer class label (0 or 1) is mapped to its corresponding English word via our labels list. Now even your grandmother could understand the output (provided she knows what a 'soft skill' is).
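Tying the steps above together, the whole judge can be sketched as one function. Model and tokenizer are passed in as parameters so the sketch stays runnable without downloading weights; the label order is an assumption, as before:

```python
import torch

labels = ["Technical", "Soft"]  # assumed index order; must match fine-tuning

def classify(text, model, tokenizer):
    """End-to-end sketch: text in, human-readable label out."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True, padding=True)
    with torch.no_grad():                             # inference only
        outputs = model(**inputs)
    class_id = outputs.logits.argmax(dim=-1).item()   # winning class as an int
    return labels[class_id]                           # machine int -> human word
```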
And here, we ride the lightning! This launches a beautiful browser-based interface — built entirely from your code — where anyone can test your classifier. No hosting service to sign up for, no firewall wrangling: launch() spins up a local web server and UI for you. Just pure, delightful, deployable Python.
A single function. A transformer model. A web app. Is this not the future Alan Turing dreamed of?
You’ve just connected the worlds of symbolic computation, deep learning, and cloud deployment — all within 30 lines of Python. Not bad for a Tuesday afternoon.