AI-generated “Workslop” is undermining productivity
AI-generated “workslop” looks polished but wastes time, erodes trust, and costs millions in lost productivity.

AI coaching quality depends heavily on context. Too little context leads to generic guidance that managers stop using quickly. Too much ungoverned context creates privacy risk and undermines trust.
What matters for CHROs is understanding which types of context materially improve coaching outcomes, and how those inputs can be handled responsibly.
In building Pascal and deploying AI coaching in organizations ranging from 100 to 5,000 employees, we have seen a consistent pattern emerge.
Platforms that integrate meaningfully with organizational data sustain manager engagement and produce measurable outcomes. Platforms that sit outside day-to-day work reality tend to be explored briefly and then abandoned.
For CHROs, success depends less on whether AI coaching is theoretically valuable and more on whether the platform delivers guidance that managers can apply immediately in their real environment.
Effective AI coaching relies on four specific layers of context. Together, these layers are sufficient to personalize guidance while maintaining appropriate boundaries:

1. Role scope, goals, performance history, 360 feedback, and career aspirations. This enables personalization without requiring access to personal or sensitive information.
2. Company values, competency frameworks, leadership principles, and cultural norms. This ensures coaching reinforces how leadership is expected to show up inside your organization.
3. Meeting dynamics, communication behaviors, and collaboration signals. This reflects how leadership operates in practice rather than how it is described in policy documents.
4. Performance reviews, goal-setting periods, and feedback moments. This allows coaching to surface when managers are most likely to act on it.
Without these inputs, coaching guidance remains abstract and requires translation effort that most managers do not sustain.
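As a rough sketch of how these four layers might hang together, the payload below shows one plausible structure for a coaching request. The field names and groupings are illustrative assumptions for this article, not Pascal's actual schema.

```python
# Illustrative sketch only: one plausible way to structure the four context
# layers as a single payload for a coaching request. Field names and types
# are assumptions for this example, not Pascal's actual schema.
from __future__ import annotations
from dataclasses import dataclass, field


@dataclass
class IndividualContext:
    role_scope: str
    goals: list[str]
    performance_history: list[str]   # summarized review themes, not raw documents
    feedback_360_themes: list[str]
    career_aspirations: list[str]


@dataclass
class OrganizationalContext:
    company_values: list[str]
    competency_framework: list[str]
    leadership_principles: list[str]
    cultural_norms: list[str]


@dataclass
class BehavioralContext:
    meeting_dynamics: list[str]      # e.g. "manager speaks most of the 1:1 time"
    communication_behaviors: list[str]
    collaboration_signals: list[str]


@dataclass
class TimingContext:
    upcoming_review_cycle: str | None = None
    goal_setting_period: bool = False
    recent_feedback_moments: list[str] = field(default_factory=list)


@dataclass
class CoachingContext:
    """All four layers combined into the payload behind a single coaching request."""
    individual: IndividualContext
    organizational: OrganizationalContext
    behavioral: BehavioralContext
    timing: TimingContext
```

The point of a structure like this is the boundary it draws: every field is work-related, and nothing personal or sensitive is required to make the guidance specific.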
Organizations using AI-powered learning systems enriched with company-specific data consistently report the same thing: guidance that reflects a manager's real situation, team dynamics, and organizational expectations is far more likely to be used immediately.
When a manager asks Pascal for help delegating work, the response is shaped by observable context.
Pascal understands how that manager typically communicates, which team members are ready for stretch assignments, and what pressures exist on current projects. The resulting guidance aligns with real constraints and real people, which makes it immediately usable.
General-purpose AI tools operate without awareness of your people, culture, or operating norms. As a result, managers receive advice that sounds reasonable but fails to fit their situation.
Managers must then re-explain team dynamics, performance history, and cultural expectations each time they want useful output. When guidance conflicts with organizational norms, adoption declines quickly.
Organizations achieve the balance between relevant context and responsible data handling by integrating AI coaching into systems employees already use, such as HRIS, Slack, Teams, and LMS platforms, while enforcing strict governance controls.
Effective safeguards include SOC 2 compliance, isolation of coaching conversations at the individual level, and a strict policy of never using customer data for model training. We designed Pascal to maintain all three.
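To make that concrete, the configuration sketch below shows how such controls might be written down as explicit, reviewable settings. The option names, scopes, and values are invented for illustration and are not drawn from Pascal or any specific vendor.

```python
# Illustrative sketch only: governance controls expressed as configuration.
# Structure, option names, and values are assumptions for this example.
GOVERNANCE_CONFIG = {
    # Which workplace systems the coaching platform may read from, and how much.
    "integrations": {
        "hris":  {"enabled": True, "scope": "role, goals, review summaries"},
        "slack": {"enabled": True, "scope": "communication patterns only"},
        "teams": {"enabled": True, "scope": "meeting dynamics only"},
        "lms":   {"enabled": True, "scope": "completed learning paths"},
    },
    # Controls that limit how coaching data is stored and used.
    "data_controls": {
        "soc2_compliant": True,
        "conversation_isolation": "per_individual",  # no cross-employee visibility
        "train_models_on_customer_data": False,
        "retention_days": 365,                       # assumed value for illustration
    },
}

# A deployment review can simply assert the non-negotiable controls.
assert GOVERNANCE_CONFIG["data_controls"]["train_models_on_customer_data"] is False
assert GOVERNANCE_CONFIG["data_controls"]["conversation_isolation"] == "per_individual"
```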
Advanced AI coaching platforms recognize situations that require human judgment rather than automated guidance.
Escalation should occur for topics such as terminations, harassment, discrimination, medical concerns, and employee grievances. The International Coaching Federation’s 2024 ethics update emphasizes confidentiality, trust, and professional responsibility in AI-supported coaching.
Pascal flags these situations and routes them to HR, reducing legal risk while maintaining employee confidence in the system.
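A minimal sketch of that routing rule is shown below, using the topic list from this section. The upstream topic classifier is assumed to exist, and the function and topic names are hypothetical rather than Pascal's implementation.

```python
# Illustrative sketch only: route sensitive topics to HR instead of
# generating automated coaching guidance. The upstream topic classifier
# is assumed; function and topic names are hypothetical.

# Topics that should always reach a human rather than an AI coach.
ESCALATION_TOPICS = {
    "termination",
    "harassment",
    "discrimination",
    "medical_concern",
    "employee_grievance",
}


def route_request(detected_topics: set[str]) -> str:
    """Return 'escalate_to_hr' when any sensitive topic is detected,
    otherwise let normal AI coaching proceed."""
    if detected_topics & ESCALATION_TOPICS:
        # Flag the conversation and hand it to HR; tell the employee
        # explicitly that a person will follow up.
        return "escalate_to_hr"
    return "continue_ai_coaching"


# Example: a conversation touching on a grievance gets escalated.
print(route_request({"employee_grievance", "delegation"}))  # -> escalate_to_hr
print(route_request({"delegation"}))                        # -> continue_ai_coaching
```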
AI coaching delivers value when it fits the reality managers operate in every day. Context makes guidance easier to trust, easier to apply, and more likely to change behavior over time. Without it, even well-intentioned advice creates extra work rather than support.
The most effective platforms are deliberate about the context they use and equally deliberate about the boundaries they set. They rely on work-related signals that improve relevance, avoid data that adds risk without benefit, and make privacy and transparency part of the core system design.
For CHROs, the opportunity is not simply to introduce AI coaching, but to choose an approach that aligns with how leadership actually shows up inside the organization. When coaching arrives in the moment, reflects real team dynamics, and respects employee trust, it becomes part of how managers work rather than another tool they are asked to adopt.
