Scoring Criteria. Each entry will receive a score of 0-30, based on the following criteria:
7.3.2.1. How easy to understand? (0 – min, 10 – max)
7.3.2.1.1. How much is the text used for explanation? (0 points: more than 10 sentences; 1 point: less than 10 sentences and it is relevant; 2 points: text is concise and contributes to understanding of report)
7.3.2.1.2. Are the indicative colors in charts instinctually understandable? (e.g. Red – critical, alert; Green – no issues or mild) (0 points: color scheme is confusing; 1 point: main idea is understandable, but some details are confusing; 2 points: the meaning of the data is completely understandable)
7.3.2.1.3. Does it tell a story? (0 points: no discernible connection between data on report pages or charts on the page; 1 - 2 points: connection between different charts on the same page is understandable; 3-4 points: it is somewhat understandable how the data forms a narrative that answers the challenge; 5-6 points: the narrative and outcome is clear and understandable)
7.3.2.2. How easy to use? (0 – min, 10 – max)
7.3.2.2.1. Are graphs interactive (e.g. clicking on bar in one chart filters the data appropriately in others on the same page)? (0 points: none of the graphs are interactive; 1 point: some graphs are interactive (no more than 3 connections in all pages); 2 points: most of the graphs are interactive)
7.3.2.2.2. Response time. (1 point: there is noticeable lag in switching between report pages; 2 points: there is no lag.)
7.3.2.2.3. Use of advanced functionality:
7.3.2.2.3.1. Filters inside the report (e.g. user can filter the report data) (0 points: no filters; 1 point: there is a filter; 2 points: there is a filter and it is relevant to the challenge; 3 points: there are several filters and they are relevant to the challenge)
7.3.2.2.3.2. Use of layers (e.g. switching between different data representations on same data page) (0 points: no such functionality; 1 – 3 points on implemented use of layers, relevancy to data display and ease of use)
7.3.2.3. How did you like the design? (0 – min, 10 – max)
7.3.2.3.1. Visual design: is the overall look consistent, no empty spaces, no overcrowding? (0 points: not at all; 1 point: in most places; 2 points: fits all points)
7.3.2.3.2. Interface design: are there unnecessary visualisations/buttons/complexity in use? (0 points: consistently and many; 1 point: in some places; 2 points: not observed)
7.3.2.3.3. UX design: is the produced report usable (e.g. clickable)? (0 points: not at all; 1 point: in most places; 2 points: fits all points)
7.3.2.3.4. Report design: is the main challenge answered? (0 points: not at all; 1 point: in most places; 2 points: fits all points)
7.3.2.3.5. Technical: are all the fonts used the same, are the sizes readable? (0 points: not at all; 1 point: in most places; 2 points: fits all points)
Appears in 1 contract
Sources: Contest Terms and Conditions
Scoring Criteria. Each entry will receive a score of 0-34, based on the following criteria:
7.3.2.1. How easy to understand? (0 – min, 10 – max)
7.3.2.1.1. How much is the text used for explanation? (0 points: more than 10 sentences; 1 point: less than 10 sentences and it is relevant; 2 points: text is concise and contributes to understanding of report)
7.3.2.1.2. Are the indicative colors in charts instinctually understandable? (e.g. Red – critical, alert; Green – no issues or mild) (0 points: color scheme is confusing; 1 point: main idea is understandable, but some details are confusing; 2 points: the meaning of the data is completely understandable)
7.3.2.1.3. Does it tell a story? (0 points: no discernible connection between data on report pages or charts on the page; 1 - 2 points: connection between different charts on the same page is understandable; 3-4 points: it is somewhat understandable how the data forms a narrative that answers the challenge; 5-6 points: the narrative and outcome is clear and understandable)
7.3.2.2. How easy to use? (0 – min, 14 – max)
7.3.2.2.1. Cross-chart filtering implementation across the report. Can other visuals provide relevant data as the user explores the report? (0 points: none of the charts cross-filter; 1–2 points: some charts cross-filter (no more than 3 connections in all pages); 3-4 points: cross-chart filtering is fully implemented across the entire report.)
7.3.2.2.2. Response time. (1 point: there is noticeable lag in switching between report pages; 2 points: there is no lag.)
7.3.2.2.3. Drill Down: multi-layer data exploration. Can the user drill down and gain additional insights within the report? (0 points: the report does not provide any drill down functionality; 1-2 points: some (1-3) visuals may offer certain drill down interactions; 3-4 points: multiple visuals provide satisfactory drill down functionality; 5 points: drill down is fully implemented across the entire report as the main user interaction.)
7.3.2.2.4. Use of tutorial overlays and other elements to assist new users. Can a new user start using this report straight away with just the guidance provided within the report itself? (0 points: no such functionality; 1 point: some assistance elements are implemented, but they do not explain enough to fully onboard a new user; 2 points: assistance elements are fully implemented and provide satisfactory information; 3 points: assistance elements are implemented in a thoughtful and visually pleasing way, and they fully explain the report to a new user.)
7.3.2.3. How did you like the design? (0 – min, 10 – max)
7.3.2.3.1. Visual design: is the overall look consistent, no empty spaces, no overcrowding? (0 points: not at all; 1 point: in most places; 2 points: fits all points)
7.3.2.3.2. Interface design: are there unnecessary visualizations/buttons/complexity in use? (0 points: consistently and many; 1 point: in some places; 2 points: not observed)
7.3.2.3.3. UX design: is the produced report usable (e.g. clickable)? (0 points: not at all; 1 point: in most places; 2 points: fits all points)
7.3.2.3.4. Report design: is the main challenge answered? (0 points: not at all; 1 point: in most places; 2 points: fits all points)
7.3.2.3.5. Technical: are all the fonts used the same, are the sizes readable? (0 points: not at all; 1 point: in most places; 2 points: fits all points)
Appears in 1 contract
Sources: Contest Terms and Conditions
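As a quick cross-check of the point arithmetic in the criteria listed immediately above, the minimal Python sketch below tallies the maximum attainable points per criterion. The variable names are illustrative only and are not part of the Terms and Conditions.

```python
# Maximum points per sub-criterion, taken from the scoring criteria above
# (variable names are hypothetical, introduced only for this tally).
understand_max = 2 + 2 + 6     # 7.3.2.1.1 text + 7.3.2.1.2 colors + 7.3.2.1.3 story = 10
use_max = 4 + 2 + 5 + 3        # 7.3.2.2.1 cross-filtering + 7.3.2.2.2 response time
                               # + 7.3.2.2.3 drill down + 7.3.2.2.4 assistance = 14
design_max = 5 * 2             # five design sub-criteria (7.3.2.3.1-7.3.2.3.5), 2 points each = 10

total_max = understand_max + use_max + design_max
print(understand_max, use_max, design_max, total_max)  # 10 14 10 34
```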