This article explains how Cognexo measures knowledge using dynamic scoring of users and questions, categorising learners into four levels with ranks. It emphasises three KPIs: registration, engagement, and competency, which are reviewed at 3, 6, and 9 months to adjust targets. Setting KPIs after initial data collection ensures realistic goals. Cognexo links KPIs to business outcomes and provides detailed reporting in its Intelligence Portal.
Why a simple percentage score is not enough
Many organisations arrive at Cognexo having measured learning through completion rates or percentage scores — for example, tracking that 75% of staff answered a quiz correctly. On its own, this figure is difficult to act on. It does not tell you whether that question was genuinely hard or trivially easy, whether the same people are consistently getting it wrong, or whether knowledge is holding over time or fading quickly.
Cognexo takes a different approach. Rather than producing a single percentage output per user, it scores both the learner and the question against each other dynamically. A correct answer to a difficult question carries more weight than a correct answer to an easy one. An incorrect answer to a question that most people get right signals a bigger gap than missing a question that trips everyone up.
| Note: This means that two users who have each answered 75% of their questions correctly could be at meaningfully different knowledge levels, depending on the difficulty of the questions they received. |
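Cognexo's actual scoring algorithm is proprietary, but the idea of weighting answers by question difficulty can be pictured with a simple Elo-style rating update. The sketch below is purely illustrative: the function names, the `k` factor, and the rating scale are all assumptions, not Cognexo's implementation.

```python
# Illustrative Elo-style update: NOT Cognexo's algorithm, just a sketch
# of difficulty-weighted scoring where both user and question carry a rating.

def expected_correct(user_rating: float, question_rating: float) -> float:
    """Probability the user answers correctly, given both ratings."""
    return 1.0 / (1.0 + 10 ** ((question_rating - user_rating) / 400))

def update(user_rating: float, question_rating: float,
           correct: bool, k: float = 32.0) -> tuple[float, float]:
    """Move the user's rating by the surprise in the result.
    A correct answer to a hard question moves the rating more."""
    expected = expected_correct(user_rating, question_rating)
    actual = 1.0 if correct else 0.0
    delta = k * (actual - expected)
    # The question's rating moves the opposite way, so question
    # difficulty is also learned from the user base over time.
    return user_rating + delta, question_rating - delta

# A correct answer to a harder question earns a bigger rating gain:
easy_gain = update(1200, 1000, correct=True)[0] - 1200
hard_gain = update(1200, 1400, correct=True)[0] - 1200
print(easy_gain < hard_gain)  # True
```

In a scheme like this, two users with the same raw percentage of correct answers end up with different ratings if one of them was answering harder questions, which is exactly the distinction a flat percentage score hides.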
How Cognexo scores knowledge: ranks and levels
Every user in Cognexo has a rank and level for each topic they are enrolled in. There are four levels — Learner, Improver, Expert, and Master — and each level is subdivided into ten ranks (1–10), giving 40 incremental positions per topic.
| Level | What it means |
|---|---|
| Learner | Beginning to build familiarity with the topic. Questions are matched to the user's current knowledge baseline. |
| Improver | Developing a solid understanding. The algorithm begins to serve harder questions to challenge progression. |
| Expert | Strong, demonstrable knowledge. Capable of applying understanding in practical scenarios. |
| Master | Exceptional and consistent mastery. The go-to benchmark within the organisation for that topic. |
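The four levels with ten ranks each form a single ladder of 40 positions. As a quick illustration of that structure, a hypothetical helper (not part of any Cognexo API) can map an overall position to its level and rank:

```python
# Hypothetical helper mapping an overall position (1-40) to a
# (level, rank) pair, following the structure described above.
LEVELS = ["Learner", "Improver", "Expert", "Master"]

def level_and_rank(position: int) -> tuple[str, int]:
    """Position 1 -> ('Learner', 1); position 40 -> ('Master', 10)."""
    if not 1 <= position <= 40:
        raise ValueError("position must be between 1 and 40")
    level = LEVELS[(position - 1) // 10]   # ten ranks per level
    rank = (position - 1) % 10 + 1
    return level, rank

print(level_and_rank(16))  # ('Improver', 6)
```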
These ranks update dynamically as questions are answered across your user base. A user's rank can go up or down; this is by design. It reflects their knowledge at that point in time, not just an accumulation of past correct answers. This means the data is always current, always reflecting real competency rather than a one-time test result.
Setting your KPI threshold
Every topic in Cognexo has a configurable KPI — a minimum rank that users should reach. The platform default is Improver 6-9, which is a reasonable starting position for most topics.
When you set a KPI, the algorithm takes it into account when deciding which questions to prioritise. Users who are below the threshold will receive more questions from that topic until they reach it. This means a well-calibrated KPI actively accelerates knowledge improvement in the areas that matter most.
| Note: A topic covering critical compliance knowledge may warrant a higher threshold than a topic covering general product awareness. Matching the KPI to the business consequence of not knowing the content is a useful rule of thumb. |
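How prioritisation towards a KPI might work can be sketched in a few lines. This is an assumption about the general mechanism, not Cognexo's implementation; the `boost` factor and function name are invented for illustration.

```python
# Illustrative sketch (not Cognexo's implementation): weight question
# delivery for a topic towards users who are below its KPI threshold.
def delivery_weights(user_positions: dict[str, int], kpi_threshold: int,
                     boost: float = 2.0) -> dict[str, float]:
    """Users below the KPI get a higher sampling weight for this topic."""
    return {user: (boost if pos < kpi_threshold else 1.0)
            for user, pos in user_positions.items()}

positions = {"amira": 12, "ben": 20, "chloe": 25}
print(delivery_weights(positions, kpi_threshold=16))
# {'amira': 2.0, 'ben': 1.0, 'chloe': 1.0}
```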
KPI by topic — an example
| Topic | KPI threshold | Rationale |
|---|---|---|
| Regulatory compliance | Expert 5 | High consequence — must be demonstrably known, not just attempted |
| Product range | Improver 9 | Important but lower risk. Aim for solid working knowledge |
| New process rollout | Improver 6 | Recently launched — allow time to build before raising the bar |
Why you should not fix targets before launch
It is tempting to define precise success targets upfront — for example, committing to 85% of users at Expert level within three months. The challenge is that without live data, you are guessing at what is achievable for your specific audience, content, and engagement patterns.
There are two common problems with committing to fixed targets too early. If the target is too easy, hitting it looks like success but does not challenge the organisation to keep improving. If the target is too hard, it creates pressure to explain underperformance before the platform has had time to establish habits.
A more effective approach is to treat the first three months as a data collection phase. Use this time to establish your baseline: how users are registering, how quickly they are engaging, and how the rank distribution is moving. Then set meaningful targets based on what the data is actually showing you.
KPI review cadence
Cognexo's Customer Success team uses a structured check-in cadence after launch. At each check-in, the data is used to answer three questions:
- How many users are currently at or above KPI in each topic?
- If we raise the KPI threshold, how many users would fall below it — and how many would the algorithm need to bring back into range?
- Are there topics where everyone is comfortably above KPI, suggesting the bar needs raising or the content needs refreshing?
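All three check-in questions are straightforward calculations over the current rank distribution. The sketch below illustrates them using overall positions 1-40 (as in the level and rank scheme above); the example numbers are invented.

```python
# Hypothetical review-time calculations for the three check-in questions,
# using overall ladder positions 1-40. All numbers are invented examples.
positions = [8, 14, 17, 22, 22, 29, 35]  # user positions in one topic
kpi = 16                                  # current threshold
proposed = 25                             # candidate raised threshold

at_or_above = sum(p >= kpi for p in positions)
would_fall_below = sum(kpi <= p < proposed for p in positions)
everyone_comfortable = all(p >= kpi for p in positions)

print(at_or_above)           # 5 users currently meet the KPI
print(would_fall_below)      # 3 users would drop below a raised threshold
print(everyone_comfortable)  # False: the current bar is still doing work
```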
These check-ins are where KPI strategy becomes meaningful. Rather than managing to a single fixed goal, you are managing a live system: adjusting thresholds as the workforce improves, identifying topics that need more content, and removing or archiving material that has become too easy to be useful.
| Note: Some organisations set KPIs so that a consistent percentage of users — for example 20% — are always working towards the threshold. This means the target moves as the workforce improves, creating a continuous improvement culture rather than a one-time finish line. |
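A rolling KPI of this kind is effectively a percentile of the current rank distribution. As a sketch (the function name and the 20% figure follow the note's example; nothing here is a Cognexo feature):

```python
# Sketch of a "rolling" KPI: place the threshold so that roughly 20% of
# users sit below it and are always working towards it.
def rolling_kpi(positions: list[int], pct_below: float = 0.20) -> int:
    """Return a threshold that leaves about pct_below of users under it."""
    ordered = sorted(positions)
    cutoff = int(len(ordered) * pct_below)
    return ordered[cutoff]

positions = [8, 11, 14, 17, 20, 22, 25, 29, 33, 35]
print(rolling_kpi(positions))  # 14: the two lowest users sit below it
```

As the workforce improves and positions rise, recomputing this threshold moves the target upwards automatically, which is the continuous-improvement behaviour the note describes.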
Questions that everyone gets right
During review, it is worth identifying questions where the correct response rate is consistently very high. There are two sensible options here. The first is to move them into an onboarding flow — questions that experienced users find trivial are often exactly the right level of difficulty for someone new to a topic, and saving them for new starters means they still do useful work without occupying space in the main delivery cycle. The second is to retire them entirely: if a piece of knowledge is genuinely universal and stable, continuing to ask about it adds no value.
The reverse is also worth noting: questions that very few people get right may indicate a content problem rather than a knowledge problem. If a question is consistently answered incorrectly across all groups, review whether the question or answer options are clearly written before concluding that the knowledge gap is widespread.
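Both ends of the review can be expressed as a simple triage over per-question correct-response rates. The thresholds below are arbitrary illustrative choices, not values Cognexo prescribes:

```python
# Hypothetical content-review sweep: flag questions whose correct-response
# rate is very high (onboarding/retire candidates) or very low (possible
# wording problem). The 0.95 and 0.20 cut-offs are arbitrary examples.
def triage_questions(correct_rates: dict[str, float],
                     high: float = 0.95, low: float = 0.20):
    onboarding_or_retire = [q for q, r in correct_rates.items() if r >= high]
    review_wording = [q for q, r in correct_rates.items() if r <= low]
    return onboarding_or_retire, review_wording

rates = {"q1": 0.98, "q2": 0.72, "q3": 0.15, "q4": 0.96}
easy, suspect = triage_questions(rates)
print(easy)     # ['q1', 'q4']
print(suspect)  # ['q3']
```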
Linking KPIs to business outcomes
Cognexo's data shows you what is happening inside the platform. The business value comes from connecting that data to outcomes outside the platform.
The most effective implementations identify a small number of external metrics that should move as knowledge improves. These do not need to be formally agreed at launch, but having a rough hypothesis — "if knowledge in this topic improves, we expect to see X change in Y measure" — gives you the building blocks for a meaningful ROI conversation.
| Business focus | Cognexo KPI to track | External metric to connect |
|---|---|---|
| Compliance | % at Expert or above in compliance topics | Audit findings, breach incidents |
| Customer experience | Engagement and competency in product and process topics | Customer satisfaction or complaint rates |
| New starter readiness | Registration speed and early rank progression | Time to productivity, 90-day retention |
| Sales performance | Expert-level competency in product knowledge topics | Conversion rates, average order value |