Beware the Emperor’s New Clothes – how to avoid assessment embarrassment


June 19, 2020

Good assessment design is a combination of art and science. It’s not easy, and the quality, rigour and validity of what is produced can vary hugely. Clients often, and understandably, outsource this type of work to a consultancy – and then, as non-experts, find it really challenging to judge the quality of what is produced for them. Getting it wrong can have serious consequences too – from reputational damage with candidates to claims of bias and unfairness in assessment decisions.

With this in mind, here are a few things to look for to ensure that you’ve been given something ‘fit for a king’ and not something that will leave you (metaphorically at least) naked in public!

Fabric Choice: Competency Frameworks

  • A good competency framework is the foundation for everything that comes after – getting it right is important.
  • Competencies should always include definitions – why this competency matters – as well as indicators of positive (and often negative) behaviour – what it looks like in practice.
  • Indicators should be behaviourally stated – how you will know if someone has this attribute.
  • The framework should be comprehensive enough to cover all target roles, but not so complex that it is unwieldy and impractical – if the printed framework looks like a telephone directory, will it really be used in practice?

Cutting the Cloth: Assessment Centre Design

  • Best practice in assessment centres is to measure each competency at least twice, ideally three times, in order to give candidates a fair opportunity to show what they can do.
  • This needs to be balanced against assessor workload however – assessing every competency in every exercise is rarely needed and can lead to assessor fatigue.
  • Good centres balance individual and group work, interactive and written elements. Make sure that you’re not inadvertently favouring extroverts/introverts or focusing on analytical skills more than interpersonal ones etc.
  • The AC should give candidates a good insight into the role, company and typical tasks – it should provide a ‘sell’ to candidates and not just be an opportunity for one-way assessment of them.
  • Consider building in some opportunities for feedback and development – giving candidates something positive from the experience, irrespective of their pass/fail result can further enhance the brand and reputation of your business, as well as increase the likelihood that your chosen candidates will accept the offers you make them.
  • Scenario choice is an important decision and one that you should have a say on before any design takes place – will all exercises be set within one overarching scenario, or will each be in a different context? Will this be your actual organisation or a fictitious, parallel one?  Are the chosen scenarios reflective of your organisational structure and culture, so that they act as a realistic job preview for candidates?
  • Is the proposed design future-proofed to avoid you having to update it regularly?
  • Irrespective of AC length, and number and format of exercises, a good design should consider the needs of all parties – allowing breaks for candidates and assessors, ensuring that facilitators don’t need to magically be in two places at once, and thinking about the logistics of number and size of rooms needed etc.

Stitching Together: Exercise Materials

  • Firstly, check that the designed exercises measure the competencies they’re supposed to. Do candidates have sufficient opportunity to demonstrate their capability on all assessed competencies?  For example, you can’t assess ‘Innovation’ if you haven’t given candidates an opportunity to innovate.
  • Secondly, check that the exercises seem fair and reasonable – would candidates new to your company understand any technical information included, is the amount of preparation time realistic for the volume of material being presented, do the exercises represent a fair level of challenge for the target group, etc.?
  • If you have parallel versions of exercises, check that they are comparable – would two candidates taking the two versions have a similar experience? The level of challenge involved and the tasks required should be consistent, even if the details involved vary.
  • Assessor materials should reflect both the competency framework and the exercise design – check that rating forms don’t include behavioural indicators which it wouldn’t be possible to show in the exercise (e.g. ‘presents confidently’ in a written task). Check that the breadth of the competency is assessed as far as possible.  Check that all indicators are behavioural – they should reflect what candidates actually say and do, and not what their personality is like – you cannot assess ‘has emotional intelligence’ in one indicator on a rating form!

Finishing Touches: Quality Checks

  • Are the materials you’ve been given well-written, without jargon, typos or grammatical errors? Everyone makes these mistakes occasionally of course, but if the work has a high volume of these it can indicate a lack of attention to detail – and if these details are wrong, what else might be …
  • Are all details consistent across materials – do character and organisation names change mid-scenario (we’ve seen this more than once!), do exercises consistently follow on from each other or does each require the candidate to change role unnecessarily?
  • Are the details involved accurate, is the language used appropriate, and is anything going to date too quickly – e.g. financial performance data, dates on emails, etc.?
  • Do the materials ‘look nice’ – what impression will they create on candidates and does that reflect your organisation?

Hopefully, the consultancies you engage with will create beautiful designs that delight and engage you, as well as being robust and fair to your candidates.  If you have any concerns about this, would like your current assessments audited, or would like to discuss the design of new processes and materials, contact us for a chat – hello@sten10.com

Brittany Davies

Pragmatic problem solver and wrangler of data.