About DIVA

What is DIVA and what does it measure?

DIVA (Differentiation Index for Value Advantage) is an AI-powered diagnostic report that analyses a professional service firm's market positioning (service and value proposition) via the proxy of the company website (i.e. the content a prospective buyer would naturally consume).

Whilst most professional service firm leaders perceive their companies to be competitively differentiated, the reality is often that such subjective, perceived differences are too nuanced to register with buyers. Indeed, in many service categories, a firm's offer - to a prospective buyer - simply sounds like everyone else's. For such firms, even the best sales outreach, proposal development and pipeline management will struggle to convert.

DIVA was built to assess your differentiation objectively - scoring your website content across ten dimensions that collectively determine how clearly and compellingly your business presents itself.

A DIVA Report contains:

  • a summary DIVA score (0-100) - alongside its relative position among all DIVA report scores (via a jittered scatter plot)
  • a descriptive summary
  • a qualitative (graded) assessment of ten sub-dimensions
  • a language analysis
  • a visual analysis
  • three actionable insights/suggestions to help firm leaders drive improvement

Who is DIVA designed for?

DIVA is built for the founders, owners, leaders, marketers and commercial leads of professional service firms (e.g. B2B service firms, management consultancies and agencies).

DIVA is especially valuable for teams seeking to obtain an objective view of their current positioning and/or seeking to deliberately improve key aspects of positioning such as clarity, service differentiation, buyer alignment and messaging impact.

How should I interpret my DIVA score and grades?

Your final DIVA score (0-100) represents a holistic picture of your market positioning across the ten assessed dimensions of possible differentiation. It is computed from the dimension gradings combined with a proprietary set of DIVA dimension weightings.

Each of the ten dimensions also receives its own grade (strong, moderate, weak, absent) and an accompanying description detailing its relative strengths and weaknesses, steering you towards where improvement activity will have the biggest impact.

You can think of it this way:

  • Strong: only minor refinement, if any, needed
  • Moderate: some differentiation evident but buyers may remain unconvinced
  • Weak: likely major gaps in this dimension's messaging
  • Absent: no evidence of any differentiation in this dimension - easy to address

The improvement recommendations generated directly from these gaps are designed to be specific, actionable and commercially meaningful.
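As an illustration, the weighted-score mechanics described above can be sketched as follows. The grade values, dimension names and weights here are purely hypothetical - DIVA's actual weightings are proprietary:

```python
# Hypothetical sketch of a weighted 0-100 score built from per-dimension grades.
# Grade values and weights are illustrative, not DIVA's real ones.

GRADE_VALUES = {"strong": 1.0, "moderate": 0.6, "weak": 0.3, "absent": 0.0}

def diva_style_score(grades: dict[str, str], weights: dict[str, float]) -> float:
    """Combine per-dimension grades into a single 0-100 score."""
    total_weight = sum(weights.values())
    weighted = sum(GRADE_VALUES[grades[d]] * w for d, w in weights.items())
    return round(100 * weighted / total_weight, 1)

# Example with three (hypothetical) dimensions:
grades = {"clarity": "strong", "proof": "moderate", "icp": "weak"}
weights = {"clarity": 2.0, "proof": 1.5, "icp": 1.0}
print(diva_style_score(grades, weights))  # prints 71.1
```

Under this sketch, improving a heavily weighted "weak" dimension moves the overall score more than polishing an already "strong" one - which is why the per-dimension grades are the place to look first.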

What are the ten DIVA dimensions and why do they matter?

DIVA analyses your market positioning and differentiation across ten dimensions:

  • Validated Market Demand: Assesses whether your service exists within a real, active, independently confirmed market - not a speculative or self-invented niche.
  • Clarity and Specificity of Value Proposition: Measures how quickly a buyer can understand what you do, for whom and why it matters - in clear, specific, buyer-relevant language.
  • Client Persona Focus: Evaluates whether you have defined a clear Ideal Client Profile (ICP) and whether your content speaks directly to that buyer’s context, pain points and priorities.
  • Geographic and Sector Positioning: Assesses whether you have committed to a specific geographic or sector niche, rather than presenting as a generalist consultancy.
  • Commercial Model and Service Packaging: Looks at how your offers are structured, packaged and de-risked (e.g. productised services, defined scopes, innovative pricing).
  • Singular Pain Agitation: Evaluates whether you articulate one clear, urgent buyer pain and highlight the consequences of inaction.
  • Proprietary Methodology and IP: Assesses the presence of a named, structured, repeatable method or toolset that differentiates your delivery.
  • Quantifiable Proven Outcomes: Looks for tangible, numerical results - commercial metrics, timeframes, case study data - that demonstrate credibility, impact and repeatability.
  • The Human Story: Evaluates the purpose, people, culture and human narrative behind your firm - your “why us” and why you exist.
  • Client Conversion Experience: Assesses how easy, clear and confidence-building it is for a buyer to take the next step - your CTAs, first-step offers, onboarding clarity and friction reduction.

Running a DIVA Assessment

What does DIVA analyse on my website?

DIVA analyses the visible, buyer-facing content of your website - the same material a prospective client would read when evaluating whether to engage with you. This includes:

  • Homepage and core service pages
  • About, methodology and insight pages
  • Case studies, testimonials and proof points
  • Messaging, structure and narrative flow

DIVA evaluates this content against the ten DIVA dimensions, focusing on positioning clarity, commercial credibility, differentiation and buyer confidence - not technical SEO or design aesthetics.

How much of my website is scraped?

DIVA scrapes a representative subset of your site, prioritising the pages most relevant to buyer decision-making.

Typically, this includes:

  • The homepage
  • Primary service or solution pages
  • About / methodology pages
  • Case studies or proof pages

The goal is not to crawl your entire site, but to capture enough high-signal content to make a reliable positioning assessment.

What is the minimum content required for a reliable assessment?

For DIVA to generate a reliable score, your site must contain enough meaningful written content to evaluate positioning across all ten dimensions.

As a rule of thumb:

  • A homepage alone is usually not sufficient
  • You should have at least several core pages (e.g. services, about, proof)
  • Thin, placeholder, or highly generic copy will limit scoring accuracy

As an absolute minimum, a DIVA assessment requires at least three pages and 1,500 words of content.
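The minimum-content rule above can be expressed as a simple pre-flight check. This is a sketch only - the exact way DIVA counts pages and words is an assumption:

```python
# Sketch of a minimum-content check before running an assessment.
# Thresholds come from the rule above: at least 3 pages and 1,500 words.
# How DIVA actually counts pages/words is assumed, not documented.

MIN_PAGES = 3
MIN_WORDS = 1500

def has_enough_content(pages: list[str]) -> bool:
    """pages: the plain-text content of each scraped page."""
    total_words = sum(len(p.split()) for p in pages)
    return len(pages) >= MIN_PAGES and total_words >= MIN_WORDS
```

A site that passes this check can still score poorly if the copy is generic - the threshold guards assessment reliability, not quality.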

How long does a DIVA assessment take?

Most DIVA assessments complete in 6-8 minutes, depending on:

  • Website size and structure
  • Content volume
  • Website ingestion and processing time

In this current phase of beta testing, however, there is a ‘human in the loop’ (HITL) step to further evaluate and improve the DIVA model. We aim to return your full DIVA PDF report within two working days of form submission.

How often should I re-run DIVA?

You should re-run DIVA whenever your positioning meaningfully changes, for example:

  • After rewriting your homepage or value proposition
  • When narrowing your ICP, sector or geography
  • After adding case studies, proof or a new service offer
  • Following a rebrand or repositioning exercise

Many users treat DIVA as a baseline → improvement → reassessment tool, using it periodically to track progress.

Data, Privacy and Security

Is my website data stored?

Website content is temporarily processed in order to run the assessment and generate your DIVA report. Once the evaluation is complete, the raw scraped content is not retained as a persistent dataset. We do retain a hash of the ingested content; this is used to detect whether your website has materially changed on any re-run request (if it hasn’t, we simply refer users to the previous DIVA result).
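The change-detection idea described above can be sketched as a content fingerprint - hashing normalised page text and comparing it on a re-run. The normalisation and hash choice (SHA-256) here are assumptions for illustration:

```python
# Sketch of hash-based change detection for re-run requests.
# Normalisation rules and the use of SHA-256 are assumptions.
import hashlib

def content_fingerprint(pages: list[str]) -> str:
    """Hash normalised page text so a re-run can detect material changes."""
    # Collapse whitespace and lower-case so trivial edits don't change the hash.
    normalised = "\n".join(" ".join(p.split()).lower() for p in sorted(pages))
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

def site_changed(pages: list[str], stored_hash: str) -> bool:
    return content_fingerprint(pages) != stored_hash
```

Storing only the fingerprint, not the content, is what allows "has the site changed?" to be answered without retaining the scraped text itself.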

How is my data handled and secured?

DIVA processes website data securely and solely for the purpose of generating your assessment.

Key principles:

  • Data is processed only to deliver your DIVA score and recommendations
  • No website content is shared with third parties beyond the AI services required to perform the analysis. DIVA uses OpenAI’s API under a commercial agreement. Data sent via the API is, therefore, not used to train AI models and is retained only temporarily for operational and security purposes.
  • Content is handled in line with standard security and access controls

DIVA analyses publicly accessible website content only - nothing behind logins, paywalls or private systems is accessed.

Do you store my full DIVA report?

Yes - your DIVA report and scores may be stored so you can:

  • View your results again
  • Compare future runs
  • Track improvement over time

Importantly, the stored report contains analysis and scoring, not a reusable copy of your website content. You remain in control of when assessments are run and whether new data is generated.

Commercial

Is DIVA free to use?

DIVA is currently available in a free, early-access, beta format, allowing users to generate a full assessment and provide feedback.

This phase exists to:

  • Validate scoring accuracy
  • Improve report clarity and usefulness
  • Refine the overall DIVA framework

Over time, DIVA may move to a commercial model.

Can agencies or consultancies license DIVA?

We are exploring a licensing model for DIVA aimed at marketing-strategy agencies and consultancies.

Planned options include:

  • Custom branded (white label) reports
  • Use as part of your own proprietary diagnostics or advisory offers
  • Portfolio-level insights across clients

These options are intended for agencies, consultants and advisors who want a repeatable, structured way to assess positioning and commercial readiness for their own end clients and/or to generate interest from prospective clients.

If this is of interest, you can register via curve3consulting.com.

Can I get help implementing DIVA recommendations?

DIVA is designed not just to diagnose issues, but to support real improvement.

If you would like support with this improvement activity, we are building a register of trusted advisors and experts. Details can be shared on request via curve3consulting.com.

Troubleshooting

I would like to give feedback on my DIVA Report - how do I best do this?

In this current phase of beta testing, you will be emailed your complete DIVA PDF report. We would love to receive any feedback on the report (by email reply), specifically with respect to:

  • Did you find the report accurate?
  • Are you likely to share the report with colleagues?
  • How likely is the report to inspire improvement activity?
  • General feedback on the report - what you liked and what could be enhanced