
Behind the Build: Skills Tracking
It’s one thing to make it to the end of a course — it’s another to know what skills you’ve actually gained from it. Whether you’re switching careers, leveling up in your current role, or just trying to stay sharp, it can be hard to pinpoint what you know and how well you know it.
We recently rolled out a new skills tracking feature for Codecademy learners, born from user research into our “upskiller” audience — people actively building new skills or deepening existing ones to advance their careers or stay current in fast-changing fields.
“We did a deep dive into our upskiller segment and discovered recurring themes about what learners really wanted,” explains Mark Hannallah, Group Product Manager at Codecademy. The research revealed three core pain points: learners wanted to understand what skills they’d gained, assess their proficiency levels and knowledge gaps, and find meaningful ways to apply those skills.
The solution seemed straightforward, but building a system that could automatically extract skills from our diverse content library and deliver precise, skill-based recommendations (beyond just course and path recommendations) was the real challenge our engineering team had to solve. We needed AI to parse and structure skills from an enormous, unstructured content library — something that simply wouldn’t have been possible with traditional approaches.
The project: Create a system that tells learners what skills they have, lets them self-assess, and provides project-based practice opportunities.
This cross-functional effort involved Product, Design, Engineering, and Curriculum teams using generative AI to parse course content, backend services to integrate skill metadata across systems, and front-end experiences to surface skill tracking in learner journeys.
Read on to hear how the team brought this feature to life — from early brainstorming to launch and beyond.
Investigation and roadmapping
Mark Hannallah, Group Product Manager: “The first step in any project is having a clear enough understanding of the problems you’re aiming to solve, and then breaking those into discrete steps to get started. We were trying to diagram out what was a pretty complex problem — the relationship between what you learn, the skills you walk away with, and how those things ladder up. We had lots of FigJam sessions and brainstorms.”
Jerimie Lee, Staff Product Designer: “In the earlier stages, there was a lot of jumping on a call and really taking into consideration the goals that we’re looking at and then each other’s expertise. Everybody on the team has primary expertise, but we also kind of flexed across each other’s domains a little bit.”
Neil Daftary, Engineering Manager: “One of the challenges was philosophical at the beginning, where it was like, what is a skill? The language that we were using for it was challenging to define — but also I like that kind of stuff. For me it was really fun to be in that space and in the definition stage. I really enjoyed all the FigJams and brainstorming.”
Mark: “I’ll give credit to the process we follow — it’s called the ‘shape up’ process. It’s somewhere between quarterly planning and sprints. We had a really robust approach where engineering, design, and product disciplines came together to talk through different use cases, sketch out flows, and work backwards from there to build out the system requirements.”
Good to know: We define skills as broader capabilities that are built from specific techniques or concepts.
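To make that definition concrete, here’s a minimal sketch of how it might translate into data; the field names are illustrative assumptions, not Codecademy’s actual schema:

```js
// Hypothetical data shape for a skill: a broader capability composed of
// the narrower techniques or concepts that ladder up to it.
const skill = {
  id: "object-oriented-programming",
  name: "Object-Oriented Programming",
  // The specific techniques that build toward the broader skill.
  techniques: ["classes", "inheritance", "polymorphism", "encapsulation"],
};
```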
Implementation
Mark: “Prior to this project, we didn’t have structured metadata about skills associated with our content. So with generative AI, we were able to parse all our course content — past and present — and derive the skill metadata. We then structured that across our catalog.”
Neil: “One of the cool things I’d highlight is using generative AI, but with a human in the loop. There was a ton of review that Mark and [Senior Instructional Designer Alex DiStasi] did to make sure our skill metadata and outcomes were correct.
I worked with one of our engineers to write out scripts that basically query our content and generate these nice CSV or Excel files.”
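As a rough sketch (not the team’s actual code), a script like the one Neil describes might query course content, ask a model which skills it teaches, and write the results to a CSV for human review. The content source, prompt, model, and column layout below are all assumptions:

```js
// skill-extraction.mjs — hypothetical human-in-the-loop extraction sketch.
// The catalog query, prompt, and model choice here are illustrative only.
import { writeFileSync } from "node:fs";

const OPENAI_URL = "https://api.openai.com/v1/chat/completions";

// Stand-in for querying the content catalog.
async function fetchCourses() {
  return [
    { id: "learn-java", syllabus: "Classes, inheritance, generics, streams..." },
  ];
}

// Ask the model to derive skill metadata from a course syllabus.
async function extractSkills(course) {
  const res = await fetch(OPENAI_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [
        {
          role: "user",
          content: `List the skills taught by this syllabus as a JSON array of strings:\n${course.syllabus}`,
        },
      ],
    }),
  });
  const data = await res.json();
  // Assumes the model returns a clean JSON array; a real script would validate.
  return JSON.parse(data.choices[0].message.content);
}

// Write one row per (course, skill) pair so curriculum experts can review it.
// Naive comma join is fine for a quick internal review file.
const rows = [["course_id", "skill"]];
for (const course of await fetchCourses()) {
  for (const skill of await extractSkills(course)) {
    rows.push([course.id, skill]);
  }
}
writeFileSync("skills-for-review.csv", rows.map((r) => r.join(",")).join("\n"));
```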
Mark: “There’s also a platform layer: as learners complete content, we needed to track their progress toward skills. That meant iterating on and expanding robust platform services to pass metadata between the front end and back end — essentially tracking how learners engage with skill content.
Then there’s the front end — the user experiences and touchpoints. We had to figure out where in the learner journey the feature should appear and test those ideas with users.”
Neil: “Language-wise, we used JavaScript, and on the platform side it was Ruby.”
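As a minimal illustration of the platform layer Mark describes, here is roughly what a skill-progress event flowing from the front end to the back end could look like; the endpoint, event shape, and field names are hypothetical:

```js
// Illustrative only: assumptions about how skill progress might be reported
// as a learner completes content, not the actual Codecademy event schema.
const skillProgressEvent = {
  learnerId: "user-123",
  contentId: "lesson-java-generics",
  // Skills this piece of content contributes to, from the derived metadata.
  skills: [{ skillId: "java-generics", xpEarned: 50 }],
  completedAt: new Date().toISOString(),
};

// The front end reports the event; a platform service aggregates it into
// per-skill progress that the learner-facing UI can surface.
await fetch("/api/skill-progress", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(skillProgressEvent),
});
```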
Troubleshooting
Mark: “When designing a system, it’s got to be a complete, functioning system — or else it just doesn’t work. It took a lot of thinking through, discussing, and trying different things to come up with an end-to-end system that works. If something fails, then the system isn’t designed the right way.
That kind of thing can be tough — you think you’ve made progress, and then it’s a couple steps forward, a couple steps back. But for me, we were really grounded in strong user research, and I think we understood what was important to the audience we were designing for.
At the end of the day, we were able to return to the problems and figure out how to solve them. That’s what gave me energy throughout the process. Product teams are in a ton of meetings, but this class of meeting — where you’re creatively solving problems that benefit the user and the business — tends to be more fun and less draining.”
Neil: “One of my main lessons was just the importance of working cross-team — and the ‘contracts’ we formed to deliver these features. It was a learning process, especially since our team hadn’t worked closely with platform historically. Figuring out a good way to collaborate, staying up to date on what’s possible with the current data model and what’s not — that was key.
Feature-wise, it came down to things like APIs — the shape of the data, how we were going to request it, and scalability. We pushed the system to its limits in ways we didn’t expect at the beginning, so performance became a big focus.”
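To make that concrete, a cross-team API “contract” of the kind Neil mentions might pin down the response shape and a pagination scheme up front. The sketch below is one hypothetical version (route, fields, and cursor scheme are all assumptions), not the actual API:

```js
// Illustrative contract for a learner-skills endpoint. Cursor-based
// pagination is one common way to keep such an API performant at scale.
async function fetchLearnerSkills(learnerId, cursor = null) {
  const params = new URLSearchParams({ limit: "50" });
  if (cursor) params.set("cursor", cursor);

  const res = await fetch(`/api/learners/${learnerId}/skills?${params}`);
  if (!res.ok) throw new Error(`Skill request failed: ${res.status}`);

  // Expected response shape (hypothetical):
  // { skills: [{ skillId, name, proficiency, xp }], nextCursor: string | null }
  return res.json();
}
```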
Ship
Jerimie: “Even before this project, one of the key things that added fuel to it was the release of the Skill XP feature. We got really positive feedback from learners and saw statistically significant increases in engagement. That told us, ‘There’s something here. There’s a lot more we could do to help learners see their progress.’
The way it happened to be built — partly because we had to scope down and pace ourselves — was that we launched it for Java first. That gave us a mechanism to get input partway through the project as we continued building. So, it wasn’t like we had to wait 8 months to get feedback. We were able to get something out earlier, and I thought that was really nice.”
Mark: “We spent two cycles getting the MVP product to market for a subset of our catalog, and then two more cycles to scale the product across the full catalog. After our MVP launch, we got mostly positive feedback, and people started asking for more: ‘This is really cool, this is what I’ve wanted. Can you do this, this, and this next?’ It felt very validating and fueled us to continue scaling across the full catalog.”
Retrospective
Mark: “I think the pace of development these days is just getting faster. And the threshold for getting buy-in for big, platform-shifting projects is higher because people expect development to happen faster — especially with the use of AI. One learning with any project is staying mindful of scope and figuring out how to get to market as fast as possible.
The other learning is that when you build a system that powers a platform, it’s really important for the organization to understand its value and how it can be leveraged. In any company or product team, you want to be aware of how your features can empower other teams — and making that well known can be a big driver of success.”
Jerimie: “Honestly, the entire thing was a great case study to reference. A lot of the initial ideation definitely came full circle. We slowly engaged more of the organization to get buy-in, eventually pitching upwards to get the official stamp of approval. That process, even though we’ve gone through it in our careers before, felt extremely gratifying in this case given how complex the project was.”
Snaps
Mark: “Definitely the LX [learner experience] team, the engineering team, and the platform team came in clutch, and then the leadership team. There was a lot of key support at the right times.”
Neil: “Data science was involved a lot at the beginning in understanding how our content is related to each other, so I’d give them snaps.”