Cognition Intensity (CI)

The measure of computational depth within an AI system, expressed as the tokens consumed per task multiplied by task frequency. CI captures not just how complex a single task is but how often that complexity is applied, making it the depth-and-frequency driver of Cognition Throughput.

CI is the variable most directly tied to per-task revenue: deeper, more frequent work consumes more tokens, which translates into higher monetization per agent.

In this model:

Tokens are the measurable units through which reasoning, memory access, and output generation are billed and tracked. More complex tasks consume more tokens. More frequent tasks multiply that consumption over time.

CI therefore captures two compounding dimensions:

Task complexity — the amount of reasoning, retrieval, planning, and coordination required per task

Task frequency — how often tasks of that complexity are executed

Together, these determine the total computational effort expended by an agent per unit of time.

High CI indicates that agents are performing deep, sophisticated work. Current examples of high-CI operations include multi-step reasoning and planning chains, tool usage and live data retrieval, long-context analysis across large document sets, and cross-agent coordination where outputs from one agent become inputs to another.

CI falls into three observable tiers:

Low CI — simple queries, lightweight automation, single-step outputs. Token consumption per task is minimal.

Moderate CI — structured workflows, multi-step tasks, moderate reasoning depth. Token consumption is meaningful and scales with frequency.

High CI — complex enterprise-grade operations, reasoning-heavy processes, and chained multi-agent execution. Token consumption is substantial and compounds with recurrence.
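The tiering above follows directly from the tokens-per-task × task-frequency arithmetic. As a minimal sketch, it might look like the following; the function name, token counts, and frequencies are illustrative assumptions, not figures from this lexicon:

```python
# Hypothetical sketch: Cognition Intensity as tokens per task x task frequency,
# expressed here as total tokens consumed per day. All numbers are assumed
# for illustration and do not come from the source text.

def cognition_intensity(tokens_per_task: int, tasks_per_day: int) -> int:
    """CI = tokens consumed per task multiplied by how often tasks run."""
    return tokens_per_task * tasks_per_day

tiers = {
    "low":      cognition_intensity(tokens_per_task=500,     tasks_per_day=20),   # simple queries
    "moderate": cognition_intensity(tokens_per_task=10_000,  tasks_per_day=50),   # structured workflows
    "high":     cognition_intensity(tokens_per_task=250_000, tasks_per_day=100),  # chained multi-agent work
}

for tier, ci in tiers.items():
    print(f"{tier:>8}: {ci:,} tokens/day")
```

Under these assumed inputs, the high tier consumes several orders of magnitude more tokens per day than the low tier, which is the compounding of complexity and frequency the definition describes.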

Where Agent Density (AD) determines how many workers are deployed and Loop Persistence (LP) determines how long they keep working, Cognition Intensity determines how hard each task pushes them. It is the variable that distinguishes a high-value labor system from a high-volume automation layer.

CI also has a structural tailwind that AD and LP do not: as token prices continue their rapid decline, tasks that were previously cost-prohibitive at high complexity become economically viable. CI expands not only through demand but through the falling cost of depth itself — meaning the ceiling on Cognition Intensity rises with every model generation.
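The falling-cost-of-depth effect can be sketched with simple arithmetic: under a fixed per-task budget, the maximum economically viable depth (tokens per task) rises in inverse proportion to token price. The budget and the declining prices below are assumptions for illustration only:

```python
# Illustrative sketch of the "falling cost of depth" tailwind: a fixed
# per-task budget buys proportionally more tokens as token prices decline.
# The budget and price points are hypothetical, not quoted from any provider.

def max_viable_tokens(budget_per_task_usd: float, price_per_1k_tokens_usd: float) -> int:
    """Deepest task (in tokens) that stays under a fixed per-task budget."""
    return int(budget_per_task_usd / price_per_1k_tokens_usd * 1000)

budget = 0.50  # assumed fixed value at which a task is worth automating
for price in (0.06, 0.015, 0.004):  # assumed declining $/1K-token prices
    print(f"${price:.3f}/1K tokens -> up to {max_viable_tokens(budget, price):,} tokens per task")
```

Each assumed price drop raises the token ceiling at the same budget, which is the sense in which the ceiling on Cognition Intensity rises even without any change in demand.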

See also: Agent Density, Loop Persistence, Cognition Throughput, Agent ARPU, Tokenized Cognition Model

Read Signal Briefs: The emergence of digital labor economics

Read Frameworks: Agent Experience Integrity (AXI)

Read Lexicon: Digital Labor Economics (DLE)
