Quality Checks for Your Power BI Visuals

For more formal enterprise Power BI development, many people have a checklist to ensure data acquisition and data modeling quality and performance. Fewer people have a checklist for their data visualization. I’d like to offer some ideas for quality checks on the visual design of your Power BI report. I’ll update this list as I get feedback or new ideas.

Checklist illustration by Manypixels Gallery on IconScout

The goal of my data visualization quality checklist is to ensure my report matches my intended message, audience, and navigation.

There are currently 4 sections to my PBI data viz quality check:

  1. Message check
  2. Squint test
  3. Visual components check
  4. Accessibility check

Message check

  • Can you explain the purpose/message of your report in a single sentence?
  • Can you explain how each visual on the page supports that purpose/message?

I use the term purpose more often when I have a report that supports more exploratory data viz, allowing users to filter and navigate to find their own meaning in their own decision contexts. Message is much easier to define in explanatory data viz, where I intend to communicate one or more specific conclusions. My purpose or message statement often involves defining my intended audience.

If you cannot define the purpose/message of your report page, your report may be unclear or unfocused. If you can’t identify how a visual supports the purpose/message, you may consider removing or changing the visual to improve clarity and usefulness.

Squint test

You can perform a squint test by taking a step back and squinting while viewing your report page. Alternatively, you can use a desktop application, a browser add-in, or a short script (sketched after the list below) that blurs the page, simulating farsightedness. Looking at the blurry report page helps you evaluate the visual hierarchy.

  • What elements on the page stand out? Should they? Areas of high contrast in color or size stand out. People read larger things first.
  • Does the page seem balanced? Is there significantly more white space or more bright color on one side of the page?
  • Does the page background stand out more than the foreground?
  • When visually scanning an image-heavy page (which follows a Z-pattern in Western cultures), does the order of items on the page make sense? Did you position explanatory text before the chart that needs the explanation? If there are slicers, buttons, or other items that require interaction before reading the rest of the page, are they placed near the top left?
  • Is there enough space between the items on the page to keep the page from feeling overly busy? Is it easy to tell where one visual ends and another begins?
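If you’d rather script the blur than install an add-in, here is a minimal sketch using Python and Pillow. It assumes Pillow is installed (pip install pillow) and that report_page.png is a screenshot of the report page you exported; both file names are placeholders.

```python
from PIL import Image, ImageFilter

# Load an exported screenshot of the report page (placeholder file name).
page = Image.open("report_page.png")

# A larger radius is a harder squint; 8-12 px works well for a full-page screenshot.
blurred = page.filter(ImageFilter.GaussianBlur(radius=10))

blurred.save("report_page_blurred.png")
blurred.show()  # opens the blurred page in your default image viewer
```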

Visual components check

I have two levels of visual components checks: reviewing individual visuals and reviewing the visuals across a report page.

Individual visuals components check:

  • Do charts have descriptive, purposeful titles? If possible, state a conclusion in the title. Otherwise, make it very clear what people should expect to find in your charts so they can decide if it’s worth the effort to further analyze them.
  • Are chart backgrounds transparent or filled with low-saturation colors? We don’t want a background color standing out more than the data points in the chart.
  • Are bright colors reserved for highlighting items that need attention?
  • Are visual borders too dark or intense? If every chart has a border that contrasts strongly with the background, it can pull focus away from the chart and impede the visual flow. We often don’t need borders at all because whitespace can provide visual separation.
  • Does the chart use jargon or acronyms that are unfamiliar to your intended audience? Try to spell out words, add explanatory text, and/or include navigation links/buttons to a glossary to reduce the amount of effort it takes to understand the report.

Visual components check – across the page:

  • If your report contains multiple slicers, are they formatted and positioned consistently?
  • Are items on the page that are located close to each other related? Proximity suggests relationships.
  • Are colors used within and across the page easily distinguishable?
  • Are fonts used consistently, with only purposeful deviations?
  • If charts should be compared, are the axis scales set to facilitate a reasonable comparison?
  • Does the interactivity between elements on the page provide useful information?
  • Are visuals appropriately aligned? Misalignment can be distracting.

Accessibility check

I have a more comprehensive accessibility checklist on my blog that I keep updated as new accessibility features and tools are released. Below are some important things you can check to ensure your report can be used by people with a range of visual, motor, and cognitive abilities.

  • Do text and visual components have sufficient color contrast (generally, 4.5:1 for text and 3:1 for graphical components)? A sketch of the underlying calculation follows this list.
  • Is color used as the only means of conveying information?
  • Is tab order set on all non-decorative visuals on each page? Decorative items should be hidden from tab order.
  • Has alt text been added to all non-decorative items on the page?
  • Is key information only accessible through an interaction? If so, consider rearranging or pre-filtering your visuals so the important conclusion is more obvious.
  • If you try navigating your report with a keyboard, is the experience acceptable for keyboard-only users? Does accessing important information require too many key presses? Are there interactive actions that can’t be performed using a keyboard?
  • Do you have any video, audio, or animation that auto-plays or cannot be controlled by the report user?
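For the contrast check above, a contrast-checker tool is usually the fastest option, but the WCAG math is simple enough to compute yourself. Below is a minimal Python sketch of the standard relative-luminance and contrast-ratio formulas; the hex colors in the example are arbitrary, not Power BI theme defaults.

```python
def _linearize(channel: int) -> float:
    """Convert an 8-bit sRGB channel (0-255) to its linear-light value."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4


def relative_luminance(hex_color: str) -> float:
    """Relative luminance of a hex color per the WCAG definition."""
    hex_color = hex_color.lstrip("#")
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linearize(r) + 0.7152 * _linearize(g) + 0.0722 * _linearize(b)


def contrast_ratio(color1: str, color2: str) -> float:
    """WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    lighter, darker = sorted(
        (relative_luminance(color1), relative_luminance(color2)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)


if __name__ == "__main__":
    # Example: mid-gray text on a white background (arbitrary colors).
    ratio = contrast_ratio("#605E5C", "#FFFFFF")
    print(f"{ratio:.2f}:1")  # about 6.5:1, which passes the 4.5:1 text threshold
```

A result of at least 4.5:1 for text or 3:1 for graphical components meets the thresholds mentioned above.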
