Ecosystem infrastructure for smart and personalised inclusion
and PROSPERITY for ALL stakeholders
D401.2 Prosperity4All technical validation: plans
and results
Project Acronym Prosperity4All
Grant Agreement number FP7-610510
Deliverable number D401.2
Work package number WP401
Work package title Evaluation framework and technical
validation
Authors SILO team with input from P4A Consortium
(SP2 and SP3 partners)
Status Final
Dissemination Level Consortium and EC
Delivery Date 02/02/2015
Number of Pages 160
Ecosystem infrastructure for smart and personalised inclusion and PROSPERITY for ALL
stakeholders
www.prosperity4all.eu
Keyword List
Technical validation, software validation and verification plan, test scenarios, unit testing,
system testing, integration testing, end-users, developers.
Version History
Revision Date Author Organisation Description
1 31/10/2014 SILO team SingularLogic S.A. Table of Contents, overall TV
approach and TV plan templates
sent to SP2 and SP3 partners
2 25/11/2014 SILO team SingularLogic S.A. First draft including the final TV
approach and the partners’ input
sent up to that point
3 04/12/2014 SILO team SingularLogic S.A. Comments and additional input
4 23/12/2014 SILO team SingularLogic S.A. Complete version submitted to
peer review
5 02/02/2015 SILO team SingularLogic S.A. Final version addressing peer
review comments
Table of Contents
Keyword List ..................................................................................................................... 2
Version History ................................................................................................................. 2
Table of Contents.............................................................................................................. 3
List of Tables ..................................................................................................................... 8
List of Figures .................................................................................................................... 8
Executive Summary .........................................................................................................10
1. Introduction ..........................................................................................................11
1.1 Purpose and structure of D401.2 .............................................................................. 11
1.2 Interrelation with other project activities ................................................................. 13
2 Requirements for software validation ....................................................................15
2.1 Goals of Technical Validation .................................................................................... 15
2.2 Technical validation activities .................................................................................... 15
2.3 General Requirements ............................................................................................... 16
3 Methodology .........................................................................................................17
3.1 Testing methodology ................................................................................................. 17
3.1.1 Phase 1 – Unit Testing ........................................................................................ 17
3.1.2 Phase 2 - Integration Testing.............................................................................. 18
3.1.3 Phase 3 – System Testing ................................................................................... 18
3.1.4 Phase 4 – Acceptance Testing ............................................................................ 19
4 The technical validation process ............................................................................20
4.1 Iteration phases and Time planning .......................................................................... 20
4.2 Tools and methods used ............................................................................................ 22
5 Software verification and validation plan of SP2 (SVVP – SP2) ................................24
5.1 Introduction ............................................................................................................... 24
5.2 List of SP2 tools, services – modules ......................................................................... 24
5.3 WP201 - System Architecture and Unified-Listing/Marketplace .............................. 25
5.3.1 T201.1 Developer Space ..................................................................................... 25
5.3.2 T201.2 Unified listing and marketplace architecture and implementation ...... 26
5.3.2.1 Unified Listing ............................................................................................. 26
5.3.2.2 Open Marketplace ...................................................................................... 27
5.3.3 T201.3 Security Architecture and Secure Payment Infrastructure .................... 28
5.3.3.1 Security sub-system .................................................................................... 28
5.3.3.2 Payment sub-system ................................................................................... 30
5.3.3.3 User bid sub-system .................................................................................... 31
5.3.3.4 QoS-Cost negotiations ................................................................................ 33
5.3.4 Τ201.4 Gamification Prototypes and modules .................................................. 67
5.4 WP202 - Building Block Set ........................................................................................ 67
5.4.1 T202.1 Repository of components ..................................................................... 67
5.4.2 T202.2 - AT Specific I/O modules ....................................................................... 68
5.4.2.1 AsTeRICS AT Modules ................................................................................. 68
5.4.2.2 Robobraille Translator modules.................................................................. 68
5.4.3 Τ202.3 - Generic multimodal interaction modules ............................................ 69
5.4.3.1 Open Source Input Transducer Prototyping Module ................................. 69
5.4.3.2 Haptic/Touch I/O Modules ......................................................................... 70
5.4.3.3 Camera input modules ................................................................................ 70
5.4.4 Τ202.4 - Smart device and environment interconnection modules .................. 71
5.4.4.1 Smart home integration modules ............................................................... 71
5.4.5 Τ202.5 - Real time user monitoring modules ..................................................... 71
5.4.6 Τ202.6 - Web based smart personalization and interface adaptation modules 72
5.5 WP203 - Collaborative development tools/ environments ...................................... 72
5.5.1 T203.1 - Development tools for adaptive interfaces for mainstream
applications ........................................................................................................................ 72
5.5.1.1 Model-based development of Adaptive UIs ............................................... 72
5.5.1.2 Usage of web-based self-adaptive UI modules .......................................... 73
5.5.1.3 Generation of applications for the P4A runtime environment .................. 73
5.5.1.4 Extendability of the pattern set available for model-based development 74
5.5.2 T203.2 - AT Configuration environment ............................................................ 74
5.5.2.1 Web ACS ...................................................................................................... 74
5.5.2.2 Internal model wizard ................................................................................. 74
5.5.2.3 External model wizard ................................................................................ 75
5.5.3 T203.3 – Runtime environment ......................................................................... 75
5.5.4 T203.4 – Guidelines and frameworks for low cognitive and stepping stone
applications for low digital literacy ................................................................................. 76
5.6 WP204 - Media and Material Automated/Crowdsourced Transformation
Infrastructures ...................................................................................................................... 77
5.6.1 T204.1 Modularization and Extension of Media Transform Engine .................. 77
5.6.2 T204.2 Making Captioning easier and less costly Developing Low-Cost Crowd-
Corrected Captioning Platform.......................................................................................... 78
5.6.3 T204.3 Modularization and replicability of transformation engine................... 78
5.6.4 T204.4 Modular interfaces to extend function for tables and language
translation.......................................................................................................................... 79
5.6.5 T204.5 Access to Media Enhanced Documents ................................................. 79
5.6.6 T204.6 Mobile Support for Global Media Transformation Platform ................. 80
5.7 WP205 - Assistance on Demand Services Infrastructure .......................................... 80
5.7.1 AoD ..................................................................................................................... 80
5.7.1.1 AoD: User registration and profile management ....................................... 80
5.7.1.2 AoD: Service registration and management ............................................... 81
5.7.1.3 AoD: Listing of Assistance Services ............................................................. 81
5.7.1.4 AoD: Charging models ................................................................................. 82
5.7.1.5 AoD: Multimodal Technical Support ........................................................... 82
5.7.1.6 AoD: Configurable Assistance on Demand Service Network ...................... 82
5.7.1.7 AoD: try-harder, chain of services .............................................................. 83
5.7.2 Social networking and other employment models: Finding other end-
users/suppliers with similar interests and making new friends ........................................ 83
5.7.3 Social networking and other employment models: Chatting with friends ........ 84
5.7.4 Social networking and other employment models: News feed for staying up-to-
date .................................................................................................................................. 84
5.7.5 Social networking and other employment models: Supporting crowd-based
services by participating to user groups ............................................................................ 84
5.7.6 Social networking and other employment models: Organization of events and
meetings .......................................................................................................................... 85
5.8 WP206 - Sustainable Meaningful Consumers-Developer Connections (Pull vs Push)
............................................................................................................................................ 85
5.8.1 T206.2: Creation of feedforward mechanisms for directing future development
efforts .............................................................................................................................. 85
5.8.2 T206.3 Creation of Consumer Participatory R&D Mechanisms ......................... 86
5.8.3 T206.4 - Creation of Feedback and FeedPeer systems ..................................... 87
5.8.4 T206.5 - Creation of consumer-mainstream communication dimension to
Unified Listing .................................................................................................................... 87
6 Software verification and validation plan (SVVP – SP3) ..........................................89
6.1 Introduction ............................................................................................................... 89
6.2 WP301 - Communication, Daily Living, Health and Accessible Mobility ................... 90
6.2.1 T301.1 - Learning and training s/w applications ................................................ 90
6.2.1.1 Face Tracker Camera Input Module Integration ........................................ 90
6.2.2 T301.2 - Improving access to technology for dementia sufferers/carers .......... 90
6.2.2.1 Apps for people with dementia .................................................................. 90
6.2.2.2 Screens and navigation ............................................................................... 91
6.2.2.3 Video Player ................................................................................................ 91
6.2.2.4 Music Player ................................................................................................ 91
6.2.2.5 Slideshow .................................................................................................... 92
6.2.2.6 Video Call .................................................................................................... 92
6.2.2.7 Info Player ................................................................................................... 92
6.2.2.8 Configuration .............................................................................................. 93
6.2.3 T301.3 - Public Access Points to ICT ................................................................... 93
6.2.4 T301.4 - Pluggable user interfaces for home appliances, home entertainment
and home services ............................................................................................................. 93
6.2.4.1 Template URC sockets ................................................................................ 93
6.2.4.2 Simple user interfaces for older people and people with mild cognitive
disabilities. .................................................................................................................... 94
6.2.4.3 Proof-of-Concept implementation of the template sockets and the simple
user interfaces in the URCLab (Smart-Home/AAL) laboratory at the HdM ................. 94
6.2.5 T301.5 - Game-based cognitive rehabilitation & maintenance ......................... 95
6.2.6 T301.6 - Routing Guidance System .................................................................... 95
6.2.6.1 Automatic Speech Recognition for dysarthria ............................................ 95
6.2.6.2 MLS Live Services ........................................................................................ 96
6.2.6.3 Stress measurement ................................................................................... 96
6.3 WP302 - Education eLearning, Business and Employment ....................................... 97
6.3.1 T302.1 - Accessible BI ......................................................................................... 97
6.3.1.1 Accessible Business Intelligence navigator ................................................. 97
6.3.1.2 Accessible Business Intelligence Chart reporting ....................................... 97
6.3.1.3 Accessible georeferenced Business Intelligence ........................................ 97
6.3.2 T302.2 - Counselling and printing services......................................................... 98
6.3.3 T302.3 - Accessibility of learning material ......................................................... 98
6.3.4 T302.4 - Special Educational Programmes and learning tools ........................... 99
6.3.5 T302.5 - Integration of Prosperity4All with FLOE............................................. 100
6.4 WP303 – Assistance on Demand Service ................................................................ 100
6.4.1 T303.1 - Consumer assistance on demand system .......................................... 100
6.4.2 T303.2 - Business Assistance on Demand system ............................................ 101
6.4.3 Τ303.3 - Enhancing existing technical AOD services ........................................ 101
7 Conclusions and future plans ............................................................................... 103
8 References .......................................................................................................... 104
9 ANNEX I: Test Scenarios ....................................................................................... 104
9.1 SP2 Test Scenarios ................................................................................................... 104
9.2 SP3 Test Scenarios ................................................................................................... 181
10 ANNEX II: Assessment objectives and indicators ..................................................... 193
a. Maturity .............................................................................................................. 193
b. Fault tolerance .................................................................................................... 193
c. Recoverability ..................................................................................................... 193
d. Reliability compliance .......................................................................................... 193
11 ANNEX III: Data Collection File ................................................................................. 196
11.1 Sheet 1: Workpackage Information ........................................................................ 196
11.2 Sheet 2: Features, Functionalities & Capabilities of Tool/Service under technical
validation ............................................................................................................................. 196
11.3 Sheet 3: Test Scenarios of Tool/Service under technical validation ....................... 197
11.4 Sheet 4: Responsible Teams .................................................................................... 197
List of Tables
Table 1: List of SP2 tools, services – modules ........................................................................... 25
Table 2: List of SP3 services ...................................................................................................... 89
Table 3: List of SP2 test scenarios .......................................................................................... 181
Table 4: List of SP3 test scenarios .......................................................................................... 192
Table 5: Workpackage Information ........................................................................................ 196
Table 6: Features and Capabilities ......................................................................................... 196
Table 7: Test Scenarios ........................................................................................................... 197
Table 8: Responsible teams .................................................................................................... 199
List of Figures
Figure 1: Overview of D401.2 ................................................................................................... 12
Figure 2: Interrelation .............................................................................................................. 14
Figure 3: Testing phases ........................................................................................................... 17
Figure 4: Technical validation process (tentative planning) .................................................... 22
Executive Summary
The aim of this document is the definition of the Technical Validation plan; later
updates of the document will also report on the technical validation results. The
current version of this document consists of:
Chapter 2 – “Requirements for software validation”: Identifies the requirements that we have to take into account for a realistic technical validation plan that sufficiently meets project needs. These requirements are general and apply to all software categories, regardless of test phase iteration. The goals of testing and the testing activities are also analyzed in Chapter 2 as input needed to define a software validation plan.
Chapter 3 - “Methodology”: The methodology that we follow in order to establish the technical validation plan is described. Independently of the different software categories, needs, requirements and specifications that relate to the P4All project, the testing methodology is overarching and helps to establish a common understanding, collaboration and testing framework among partners.
Chapter 4 – “Technical Validation Process”: The different verification and validation activities that are part of the validation activities during the project are described. The technical validation process was defined taking into account the different software categories of the project and the user testing iteration phases.
Chapter 5 – “Software verification and validation plan for SP2”: All verification and validation activities to be conducted for software items of SP2 are defined and controlled through the use of the Software Verification and Validation Plan (SVVP). The SVVP is described and specific technical validation plans related to the first user testing iteration are presented.
Chapter 6 – “Software verification and validation plan for SP3”: All verification and validation activities to be conducted for software items of SP3 are defined and controlled through the use of the Software Verification and Validation Plan (SVVP). SVVP is described and specific technical validation plans related to the first user testing iteration are presented.
Chapter 7 - “Conclusions and future plans”: The next steps and the future plans are described.
Chapter 8 – “References”.
ANNEX I – “Test Scenarios”: All the test scenarios for both SP2 and SP3 are described in a table which contains all the necessary information.
ANNEX II - “Assessment objectives and indicators”: The definitions of the assessment objectives and indicators.
ANNEX III – “Data Collection File”: The data collection file that the SILO technical validation team created in order to collect input from partners.
1. Introduction
1.1 Purpose and structure of D401.2
The purpose of this deliverable is the creation of a technical validation (TV) plan for the SP2
and SP3 deliverables and the reporting of the technical validation results, after the
validation tests have been carried out, so that they are fed back to each SP2 and SP3 team.
Technical validation is expected to take place during the whole period of the project, given
the many different deliverables that will be produced at different project periods. Time
planning issues are described in detail below in this document.
Based on the P4A DoW, two main software categories can be identified that should
undergo technical validation:
The tools/modules that are part of the infrastructure that SP2 is developing;
Services/Applications (mostly pre-existing) into which SP2 deliverables and other
Developer Space resources may be integrated as part of the SP3 work.
Deliverable D401.2 is the main output of T401.3 of the DoW. The main objectives of
D401.2 are:
To define specific technical validation plans for tools/applications/services developed in the framework of SP2/SP3. All verification and validation activities to be conducted for software items of SP2 and SP3 will be defined and controlled through the use of the Software Verification and Validation Plans (SVVPs).
To develop validation plans relevant to each iteration of user tests and for every software item, in close collaboration with the relevant development team of SP2/3.
To carry out technical verification and validation tests on module/unit/system level and to analyse, consolidate and report the emerging validation results.
The general overview of D401.2 and the subtasks of T401.3 are summarized in the following figure.
Figure 1: Overview of D401.2
Three different types of project teams are involved in the preparation of this deliverable:
SP2 teams develop the tools/modules that are part of the infrastructure that SP2 is
expected to deliver
SP3 teams which are responsible for the development and integration of SP3 work
Technical Validation team (mainly SILO personnel) which is responsible for
orchestrating the technical validation activities.
Technical validation is the process which includes all the tests that have to be conducted
for all software development phases. The validation plan includes all the test activities to
be conducted in order to ensure that the delivered software (s/w) (and hardware (h/w),
where applicable) includes the intended functionality and that such functionality operates
properly. As already mentioned above, this document is not the final version of the TV
plan. At this point, since no s/w has been delivered for testing yet, this document reports
only on the initial TV plans and does not include any validation results. In Chapter 4.1 below,
time planning issues are discussed in relation to the 4 user iterations phases, as described
in Chapter 3. Thus, this document focuses on the technical validation process and the
Software Verification and Validation Plan (SVVP). The SVVP defines the test scenarios that
have to be conducted and passed in order for the SP2 and SP3 tools and services to be
accepted for user testing.
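The acceptance gate implied by the SVVP can be sketched as follows. This is an illustration only – the scenario identifiers and function names below are hypothetical, not part of the SVVP templates: a tool or service is released to user testing only once every one of its defined test scenarios has passed.

```python
# Hypothetical record of test scenario outcomes for one SP2 tool.
scenario_results = {
    "SP2-T201.2-001": "passed",
    "SP2-T201.2-002": "passed",
    "SP2-T201.2-003": "failed",
}

def accepted_for_user_testing(results):
    """SVVP gate: all defined test scenarios must have passed."""
    return bool(results) and all(v == "passed" for v in results.values())

# One scenario failed, so the tool is not yet accepted for user testing.
print(accepted_for_user_testing(scenario_results))  # False
```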
All the verification and validation activities that have to be conducted will be updated
before each iteration and for every software item in close collaboration with the relevant
development team of SP2/3. At this stage, taking into account the timing and early
development status of all SP2 and SP3 deliverables that are the subject of technical
validation, this document focuses on establishing the general framework of the technical
validation plan and defining the technical validation plans for each SP2 and SP3 deliverable.
Naturally, these plans will be subject to continuous update throughout the project to follow
development progress.
1.2 Interrelation with other project activities
The current deliverable anticipates the progress that will arise from all SP2 and SP3 work
packages (WP201-WP303), as it corresponds to the task which verifies the results and
deliverables of these WPs. Thus, D401.2 is strongly connected with the deliverables of SP2
and SP3.
Figure 2: Interrelation
2 Requirements for software validation
2.1 Goals of Technical Validation
The goals of technical validation will be to:
Find potential errors/defects in the tools and applications developed within the project;
Technically validate the various aspects of the tools or system components to ensure that they perform as expected;
Reduce both the number of defects delivered and the risks associated with those defects;
Establish confidence that the SP2/SP3 deliverables do what they are supposed and expected to do before they are publicly released and tested by end users under WP402 and WP403.

At this point, an important differentiation should be made between the scope of technical validation run on SP2 deliverables in relation to technical validation run on SP3 deliverables, as well as in relation to the scope of the WP402 and WP403 evaluation activities. The technical validation run on SP2 deliverables focuses on those tool/service functionalities that are being developed within the P4A project; for instance, if the purpose of an SP2 tool is to be integrated in an application in order to enhance its accessibility, the technical validation will focus on whether the specific SP2 tool is robust and mature enough, whether its APIs are easy to integrate, etc. – in short, whether developers can actually integrate the tool in their application. Whether such integration will really enhance the accessibility of the application in which the tool was integrated is out of the scope of T401.3 and is examined by the evaluation with implementers under WP402.

Similarly, the technical validation run on SP3 deliverables focuses only on those new functionalities of the SP3 applications that have been implemented thanks to the use/integration of SP2 tools (or other Developer Space resources), and not on any pre-existing functionalities that were developed outside the P4A project. Namely, the technical validation focuses on whether the integration of SP2 deliverables was successful, and not on whether such integration enhances the accessibility of the SP3 application, which is the subject of the WP403 evaluation with end users.
Reduce the overall risk within the project.
2.2 Technical validation activities
The following steps/activities provide the cornerstone of the Technical Validation (TV)
process that will be followed in the context of the project.
Define a technical validation strategy. This step covers the formulation and documentation of a technical validation strategy. All aspects of the SP2/SP3 tools and services that require testing are identified, indicating in each case the level and form of testing. This present document is meant to address this step, i.e. to define the project technical validation strategy, to be kept continuously updated.
Plan the technical validation. This step refers to the estimation of the required resources, such as tools, people, hardware and software infrastructure, and test
location. The relevant P4A project teams will define roles and responsibilities, and identify a suitably qualified test manager and test team. This present document is meant to address this step.
Set up the test environment. The team has to set up the physical test environment, the hardware and the software infrastructure, in order to test the system releases in question. Test material will be prepared, such as detailed test plans, test scripts, and test data. A ticketing system (see chapter 4.2 – JIRA) has to be set up and a decision taken on the form of defect reporting and any test metrics to use. Test standards and procedures are also defined and appropriate training provided to the team on the use of standards, procedures and testing tools. This present document is meant to address this step in terms of test material definition and provide a plan relating to the set-up of the necessary physical test environment, subject to future updates. The SVVPs of the SP2 and SP3 outcomes are meant to address this step and define the testing environment related to each SP2 and SP3 tool and service.
Perform testing. Testing conducted at the unit, integration and system level, as appropriate (not all levels may be relevant for all SP2/SP3 deliverables). This present document is meant to provide a plan for this step, subject to future updates.
Report back technical validation results. Tests are executed and test results collated and recorded in the defect-tracking system. The team will analyze the test results and report back to the development teams the areas with a high incidence of defects and high defect severity, so that the development teams can track and manage defect resolution. Where appropriate, the tests will be repeated and test materials will be reused to expedite retesting. This present document is meant to define the process and tools for this step.
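The "perform testing" and "report back" steps above can be sketched in code. The field names below are hypothetical – the actual templates are those of the Data Collection File (Annex III) and the JIRA ticketing system – but the sketch shows the kind of record that ties a test scenario to its outcome before it is filed in the defect-tracking system:

```python
from dataclasses import dataclass, field

@dataclass
class TestScenario:
    scenario_id: str               # e.g. "SP2-T201.3-001" (hypothetical ID scheme)
    tool: str                      # tool/service under technical validation
    phase: str                     # "unit" | "integration" | "system" | "acceptance"
    steps: list = field(default_factory=list)
    expected: str = ""

@dataclass
class TestResult:
    scenario: TestScenario
    passed: bool
    severity: str = "none"         # defect severity, reported back if failed
    notes: str = ""

def summarize(results):
    """Collate results so areas with a high incidence of defects
    can be reported back to the development teams."""
    failed = [r for r in results if not r.passed]
    return {"total": len(results), "failed": len(failed)}

scenario = TestScenario("SP2-T201.3-001", "Payment sub-system", "system",
                        steps=["initiate payment", "confirm receipt"],
                        expected="payment recorded exactly once")
results = [TestResult(scenario, passed=True)]
print(summarize(results))  # {'total': 1, 'failed': 0}
```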
2.3 General Requirements
First of all the SP4 team has to develop a policy and plans for all the validation activities.
The following steps shall be followed when planning and executing software validation
within the P4A project.
a. Define the objectives of the software technical validation.
b. Ensure that a Validation Plan is developed. If necessary, this plan may be subdivided into lower-level plans, subject to the complexity of the respective validation.
c. Enter project and/or product evaluation experiences into the consortium's sharepoint area (wiki, ticketing system, Google Drive), to improve the approach to software evaluation.
d. Ensure that the validation results can be quantified, clearly presented and are traceable.
e. Ensure that suitable and effective technology and best practices are used.
f. Ensure that the validation is carried out effectively.
g. Ensure that plans and recommendations supporting all future validation activities are available.
3 Methodology
3.1 Testing methodology
In this chapter, the main testing methodology and the technical validation process are described. The chapter deals with the general testing methodology that applies to the vast majority of software projects and defines how it will be applied in the project.
Four testing phases are identified: unit, integration, system and acceptance. The 4 test
phases and the teams responsible for carrying them out are described below.
Figure 3: Testing phases
3.1.1 Phase 1 – Unit Testing
In the context of this project, this phase is considered as part of the normal development
process and is expected to be conducted by each of the corresponding SP2 and SP3 teams
developing a tool or a service. The development teams will validate the modules that they
developed independently and verify that these modules satisfy the respective
requirements, each by itself. Testing each module separately (i.e. unit testing) usually takes place in a simulated environment using stubs or mocks, since no interaction between different modules is allowed.
The purpose of unit testing is to verify that each component operates according to its requirements as a single unit. This ensures that the project can move to the next phase – integration testing – with a decreased chance of total failure and, as a result, of having to move back to development to correct the code. Each SP2/SP3 development team may create a checklist of validation actions/items – test cases – that completely covers all module requirements. In some cases, automated unit tests may also be created by the relevant teams (e.g. as specified in Chapter 5 and Chapter 6 of this document, this is expected to be the case with the Developer Space and several architectural components). As soon as development is finished, unit testing begins and the team starts executing the aforementioned test cases. Unit testing results will dictate whether the modules are good enough to move to integration testing or need to be corrected/changed. Unit testing success criteria should be very strict, close to 100% success, since problems in the individual modules will most certainly cause blocking issues when the SP2 and SP3 tools and services are tested as a whole during the integration testing phase. As unit testing is tightly coupled with the development process of s/w deliverables, it is not dealt with under T401.3 and this document, but is expected to be part of each corresponding SP2 and SP3 activity within WP201-WP303.
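As an illustration of the kind of automated unit test a development team might write, the following Python sketch tests a single module against a per-module checklist of test cases. The module under test (a UI-label helper) and its requirements are hypothetical, not taken from any actual P4A deliverable:

```python
# Minimal automated unit test for one module. The function and its
# requirements are purely illustrative.

def simplify_label(label: str, max_len: int = 20) -> str:
    """Requirement (illustrative): collapse runs of whitespace and
    never return a label longer than max_len characters."""
    label = " ".join(label.split())          # collapse whitespace
    return label if len(label) <= max_len else label[: max_len - 1] + "…"

# Test cases drawn from the module's checklist; under strict unit-testing
# success criteria, all of them must pass before integration testing.
def test_short_label_unchanged():
    assert simplify_label("Save file") == "Save file"

def test_whitespace_collapsed():
    assert simplify_label("Save   file") == "Save file"

def test_long_label_truncated():
    assert len(simplify_label("x" * 50, max_len=20)) == 20

for case in (test_short_label_unchanged, test_whitespace_collapsed,
             test_long_label_truncated):
    case()
```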
3.1.2 Phase 2 - Integration Testing
After all modules have been tested and validated, the development teams will move to the integration testing phase. This is the first time that the software – i.e. all developed tools and services – is tested as a whole, in an environment closely modelled after the real operating environment. The tests should be specified at the start of the development phase.
Test cases/scenarios should provide complete coverage of the requirements. Integration
testing might or might not cover interactions with third party products or software SP2 and
SP3 tools and services coming from different development teams. The prime objective is to
test all developed modules coming from one development unit together. Integration
testing success criteria are not as strict as the unit testing success criteria and may allow some failures (which should be documented and scheduled to be fixed as soon as possible).
In the context of the P4A project, integration testing refers mainly to SP2 deliverables and
not SP3 ones, as, in most cases, SP3 applications are pre-existing applications already fully
functional, i.e. integration testing has already been completed prior to the P4A lifetime.
Integration testing of SP2 deliverables (tools or services) is dealt with in this document and
is within the scope of T401.3. Plans and criteria for integration testing of each SP2
deliverable can be found in later chapters of this document.
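The situation described above – modules from one development unit tested together while third-party products are kept out of scope – can be sketched in Python as follows. All classes are illustrative and do not correspond to actual P4A modules; the external payment gateway is replaced by a stub:

```python
# Integration-test sketch: two modules from the same development unit
# are exercised together; a third-party dependency is stubbed out.

class StubPaymentGateway:
    """Stands in for an external gateway not covered by this test."""
    def charge(self, user: str, amount: float) -> bool:
        return amount > 0   # always "succeeds" for valid amounts

class Catalogue:
    """First module under test (illustrative)."""
    def __init__(self):
        self._prices = {"screen-reader": 5.0}
    def price_of(self, item: str) -> float:
        return self._prices[item]

class OrderService:
    """Second module under test; integrates Catalogue and the gateway."""
    def __init__(self, catalogue: Catalogue, gateway) -> None:
        self.catalogue, self.gateway = catalogue, gateway
    def order(self, user: str, item: str) -> bool:
        return self.gateway.charge(user, self.catalogue.price_of(item))

# The integration scenario: both modules are wired together and tested.
service = OrderService(Catalogue(), StubPaymentGateway())
assert service.order("alice", "screen-reader") is True
```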
3.1.3 Phase 3 – System Testing
The prime objective of this phase is to test integration and interoperation among SP2 and SP3 tools and services developed by more than one development unit/team. This testing phase addresses cases where more than one SP2 deliverable is integrated to provide a more complete user experience: e.g. such may be the case of the integration between the Assistance on Demand of WP205 and the payment system of WP201. This phase also
covers cases where one or more SP2 deliverables are integrated by one or more SP3
applications. So, in the context of the P4A project, system testing refers mainly to combinations of integrated SP2 deliverables, as well as to SP3 applications integrating one or more SP2 tools or services.
System testing of SP2 and SP3 deliverables is dealt with in this document and is within the scope of T401.3. Some initial plans and criteria for system testing of SP2 and SP3 deliverables can be found in later chapters of this document; however, it should be emphasized that they are only preliminary and tentative, given the currently early development stage of the project, i.e. at this stage it is not yet fully defined which integrations will take place within SP2 and between SP2 and SP3. Test cases/scenarios will continue to be specified and updated throughout the project and are expected to be more complete towards the later development stages.
3.1.4 Phase 4 – Acceptance Testing
Acceptance Testing is conducted at the same time as integration and system testing and is
meant to determine whether the developed system/tool/service satisfies its acceptance
criteria, as these are described in Annex I – Test Scenario, and to determine whether it is
ready to be tested by end users in WP402 and WP403.
4 The technical validation process
4.1 Iteration phases and Time planning
The main objectives of SP4 – Evaluation are detailed in D401.1.
The SP4 Evaluation and Iterative Testing Framework will define and implement a concise
and overall evaluation strategy and framework that provides feedback to all SPs by
conducting different types of iterative evaluation cycles, each of which adopts an iterative
approach.
According to the DoW, there are 4 different evaluation phases during the lifecycle of the
project.
- 1st iteration of implementers evaluation (starts on M14)
- 2nd iteration of implementers evaluation (starts on M20)
- 3rd iteration of implementers evaluation and 1st iteration of end users evaluation (starts on M33)
- 2nd iteration of end users evaluation (starts on M42)
Technical validation will always have to be performed at the integration, system and acceptance levels two months before each iteration with the users. It should be emphasized that, in the context of the technical validation activity, the term “users” refers to any individual or entity to whom the SP2 and SP3 deliverables are addressed, be it developers, implementers, end-users with or without disabilities, other stakeholders, etc. Explicitly, provided that there will be no changes in the time-planning of the user evaluation activities of WP402 and WP403, the timing of each technical validation iteration is as follows:
- 1st iteration of implementers evaluation (starts on M14)
  o Technical Validation will take place during M12 and M13.
- 2nd iteration of implementers evaluation (starts on M20)
  o Technical Validation will take place during M18 and M19.
- 3rd iteration of implementers evaluation and 1st iteration of end users evaluation (starts on M33)
  o Technical Validation will take place during M31 and M32.
- 2nd iteration of end users evaluation (starts on M42)
  o Technical Validation will take place during M40 and M41.
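The scheduling rule behind this list is simply that each technical validation window occupies the two project months immediately preceding the corresponding iteration. A short Python sketch of that rule, reproducing the four windows above:

```python
def validation_window(iteration_start_month: int) -> tuple[int, int]:
    """Return the two project months reserved for technical validation
    before a user-evaluation iteration starting at the given month."""
    return (iteration_start_month - 2, iteration_start_month - 1)

# Windows for the four evaluation iterations:
# M14 -> (12, 13), M20 -> (18, 19), M33 -> (31, 32), M42 -> (40, 41)
for start in (14, 20, 33, 42):
    print(start, validation_window(start))
```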
The reasoning behind the above time-planning of the technical validation is to allow
adequate time for debugging and correction of significant technical failures before the
SP2/SP3 deliverables reach the users.
The exact level at which technical validation will be performed (i.e. integration, system or acceptance level) will depend on the development progress of each SP2/SP3 team and will always take into account the most up-to-date release of the corresponding SP2/SP3 deliverable.
Naturally, all SP2/SP3 deliverables are expected to reach their maximum maturity level
towards the later phases of the project. The technical validation taking place 2 months
before each evaluation iteration with users will focus on those SP2/SP3 deliverables that
will be part of the WP402 and WP403 evaluation (and not on all SP2/SP3 deliverables).
However, in order to ensure that appropriate technical validation feedback is provided to all
SP2/SP3 development teams, in addition to technical validation taking place 2 months
before each evaluation iteration with users, the technical validation team will run technical
validation on any SP2/SP3 deliverable major release occurring anytime before the 2-month
period. For instance, let’s consider the case of an SP2 tool addressed to developers, which,
however, is not selected by any of the SP3 teams for integration. Even though this SP2 tool
will never be part of the user evaluation activities of WP402, it will still be technically
validated and improved as soon as the corresponding SP2 team announces its release (or
update). This continuous validation process will also help the Consortium deal with the
different deadlines of each SP2 and SP3 task/deliverable.
All technical validation results will be fed back to the respective development teams as soon
as the technical validation is complete and will be accessible through the wiki and/or other
project communication fora. The goal is that at the end of each project year, all those
SP2/SP3 tools/services that have been released by their development teams will have been
technically validated at least once by the technical validation team and the results fed back
to the development teams.
However, as noted above, technical validation does not encompass only validation at the integration, system and acceptance levels; it also includes unit testing. Unit testing will not be performed in a systematic manner: it will be conducted by each SP2/SP3 development team each time a new unit is integrated into a tool/application, and it will not follow the distinct evaluation time-planning and formal reporting process described above. The technical validation team will collect the relevant results that are made available by the SP2 and/or SP3 teams throughout the project lifetime and will report these along with the system and integration test results, as a separate chapter.
The following figure provides a tentative time-planning of technical validation and the
expected maturity levels.
Figure 4: Technical validation process (tentative planning)
4.2 Tools and methods used
Data collection is one of the most complex tasks in large consortia like P4A. A major part of
T401.3 was the collection and understanding of the technical aspects of the different tools,
services and applications that SP2 and SP3 deliver and the definition of the corresponding
test scenarios, in order to produce a reliable and to-the-point Software Verification and Validation Plan (SVVP). The data collection method followed is described below:
1. Data Collection
Conference calls: All relevant P4A stakeholders (SP2/SP3 development partners, SILO as the team coordinating the technical validation, and the evaluation partners) participated in conference calls during which the development partners presented their deliverables and answered questions from the validation and evaluation partners. The main outputs of each conference call are the answers to the following questions:
  o Which are the SP2/SP3 deliverables that have to be tested?
  o Which of these are tools addressed to developers, services/infrastructures addressed to developers and end users, or applications addressed only to end users?
  o What are the features and capabilities of each tool and application?
  o How will the technical validation team collect data?
Data collection file: The SILO technical validation team created an Excel template (Annex III) to be completed by each SP2/SP3 development team. Using this template, the team managed to collect information about:
  o the tools/services that will be delivered by SP2
  o the applications that will be delivered by SP3
  o the responsible partners/development teams
  o the main features and capabilities
  o the technologies
  o the final users
  o the test scenarios of each tool under technical validation (subject to update)
  o the list of deliverables (SP2 & SP3)
Using the data collection template the technical validation team managed to collect a wide
range of test scenarios that can be run throughout the project and the various technical
validation iterations. These scenarios comprise the Software verification and validation
plan (SVVP).
It should be stressed that the test scenarios are only tentative at this stage and vary greatly in terms of detail and precision: for some SP2 tools that are currently undergoing their requirements and specifications phase, the test plans are much coarser than for others that were already mature when included in the P4A project and were thus ready to enter the implementation phase much sooner. Similarly, the SP3 technical validation plans are very tentative and coarse, given that the Developer Space (i.e. the environment and resource repository from which the SP3 teams will select tools to integrate in their applications) is still under definition and development.
2. Ticketing System: JIRA Tracking System
JIRA: When validating software, a tool for incident monitoring, control, tracking and bug fixing is very helpful. The consortium decided to use JIRA, as the sister project Cloud4All does as well. JIRA provides issue tracking and project tracking for software development teams to improve code quality and the speed of development. It simplifies every step for everyone involved, as bug reports can be created in seconds from a browser, email, or even a smartphone client. Its searching and reporting capabilities make it easy for everyone to monitor the status of each project.
The consortium has agreed to use the JIRA tracking system during the technical validation phases so that every test/result can be followed and observed by the relevant teams.
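As an illustration, the following Python sketch builds a bug report in the payload shape used by JIRA's REST API (POST /rest/api/2/issue). The project key, summary and priority values are placeholders; the consortium's actual JIRA project configuration may use different fields:

```python
import json

def bug_report(project_key: str, summary: str, description: str,
               severity: str = "Major") -> str:
    """Build a create-issue payload in the shape expected by JIRA's
    REST API (POST /rest/api/2/issue). Project key and priority names
    are placeholders depending on the actual JIRA setup."""
    payload = {
        "fields": {
            "project": {"key": project_key},
            "summary": summary,
            "description": description,
            "issuetype": {"name": "Bug"},
            "priority": {"name": severity},
        }
    }
    return json.dumps(payload)

# Hypothetical defect found during technical validation:
report = bug_report("P4A", "AoD login fails with screen reader",
                    "Steps to reproduce: ...")
```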
5 Software verification and validation plan of SP2
(SVVP – SP2)
5.1 Introduction
The aim of the Software Verification and Validation Plans as described in this document
(Chapter 5 for SP2, Chapter 6 for SP3) is to coordinate the validation using common
standards, provide the tools to SP2 developers and SP3 implementers and collect the
results and status information.
Technical validation of the SP2 tools to be tested will be conducted in a coordinated way
before the initiation of each iteration phase or after the announcement of each major
release by the respective development team, as explained in section 4.1 above. The
creation and updating of each technical validation plan will take place within the
technology and implementation work packages of SP2 by each relevant SP2 team, under
the coordination of SILO. The actual technical validation testing and reporting will be conducted, in principle, by SILO with the full support of the respective SP2 development team in terms of training, s/w and h/w required, etc. This increases the objectivity of testing, in that a third party not involved in the actual development runs the tests. However, in cases where special h/w is required for running the technical validation, or where installation and testing of the SP2 deliverable is difficult for other reasons, other arrangements will be made between SILO and the development team(s) – e.g. the technical validation may be assigned to the development team with the remote participation of SILO, or a similar solution. This will be determined on a case-by-case basis.
5.2 List of SP2 tools, services – modules
The following table provides a list of the SP2 tasks that will be the object of technical
validation.
For each of the SP2 deliverables, at task level, we provide the following information as part
of the SVVP: a short description of each task and subtask, the testing environment which
includes hardware and software needed for the technical validation to take place, the
expected final users and the specific test scenarios that have to be run during the testing
process.
WP  | Task Number                    | Name (tool or service)
201 | T201.1                         | Developer Space
    | T201.2                         | Unified Listing
    | T201.2                         | Open Marketplace for developers
    | T201.2 & T201.3                | Payment infrastructure, micropayment and user bid systems, incl. security modules
    | T201.4                         | Gamification Prototypes & modules
202 | T202.1                         | Repository of components
    | T202.2                         | AsTeRICS AT Modules
    | T202.2                         | RoboBraille Translator modules
    | T202.3                         | General multimodal interaction modules
    | T202.4                         | Smart Device and Environment Interconnection modules
    | T202.5                         | Real time user monitoring modules
    | T202.6                         | Web-based smart personalization and interface adaptation modules
203 | T203.1                         | Development tools for adaptive interfaces for mainstream applications
    | T203.2                         | AT configuration environment
    | T203.3                         | Runtime environment
    | T203.4                         | Guidelines and frameworks for low cognitive and stepping stone applications for low digital literacy
204 | T204.1, T204.2, T204.5, T204.6 | Extended crowd-corrected captioning platform
    | T204.3, T204.4, T204.5, T204.6 | Extended RoboBraille engine
205 | T205.1 - T205.5                | Assistance on Demand services and functionalities
206 | T206.1 - T206.5                | Consumer-Developer connecting modules and services

Table 1: List of SP2 tools, services – modules
The TV plans below follow the order presented in the above table. The TV plans of each work package and task include the features, functionalities and capabilities of each task, together with a description of the testing environment. The test scenarios as defined by the corresponding teams are described in Annex I – “Test Scenarios”.
5.3 WP201 - System Architecture and Unified-
Listing/Marketplace
5.3.1 T201.1 Developer Space
Description
The Developer Space connects designers and developers to a larger community of accessibility and usability resources, expertise and tools. This work includes:
- The establishment of a network of contributors from across a variety of open source projects and communities, addressing the fragmentation of knowledge that is a typical problem within the accessibility field;
- The creation of collaborative forums/tools that will help to close the gap between developers and end-users, providing a means for users to influence and participate in the design, development and testing process more actively;
- A means for searching, browsing, and contributing (i.e. a set of “libraries” or “shelves”) relevant third-party development tools, frameworks, components, and open source applications categorized by type of development need;
- The creation of a tool to help localize user interfaces into different languages, which can be connected to automated translation tools to allow for crowd-sourced correction.
Testing environment and final users
The testing environment must cover the technologies that the development team uses.
More specifically:
Needed hardware:
  o Personal Computer (PC)
Needed software:
  o Web Browser
Final User: Internal Users / External Users / Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.3.2 T201.2 Unified listing and marketplace architecture and
implementation
5.3.2.1 Unified Listing
Description
Extending the Unified Listing structure and ontology from assistive technologies to also include access features in mainstream ICT. Then creating both 1) a manufacturer-facing interface that makes sense to manufacturers and allows them to easily enter and maintain their data, and 2) a quite different consumer-facing interface focused on allowing users to find products based on their needs. Finally, adding mainstream products to the database through federation with mainstream product data efforts.
Testing environment and final users
The testing environment must cover the technologies that the development team uses.
More specifically:
Needed hardware:
  o Personal Computer (PC)
Needed software:
  o Web technologies (JavaScript, JSON, CouchDB, etc.)
Final User: 1) Manufacturers/Vendors of both assistive technologies and mainstream ICT products and services. 2) Consumers with barriers due to disability, literacy, digital literacy, and aging.
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.3.2.2 Open Marketplace
Description
Create an Open Marketplace that:
- is designed to work for people who need assistive technologies, features or services;
- can be used by consumers BEFORE they get their assistive technologies in place;
- allows new (and existing) assistive technology and service vendors a place to market and sell their products internationally;
- provides new vendors with a very low entry cost method to market and sell products;
- allows users to provide micro-payments or bids for new things they would like to see in the market;
- supports purchase or pay-for-use models for assistive technologies.
Testing environment and final users
The testing environment must cover the technologies that the development team uses.
More specifically:
Needed hardware:
  o Personal Computer (PC)
Needed software:
  o Web technologies (JavaScript, JSON, CouchDB, etc.)
Final User: 1) Manufacturers/Vendors of both assistive technologies and mainstream ICT products and services. 2) Consumers with barriers due to disability, literacy, digital literacy, and aging.
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.3.3 T201.3 Security Architecture and Secure Payment
Infrastructure
5.3.3.1 Security sub-system
5.3.3.1.1 Security sub-system: support OAuth2 and SAML identity
standards
Description
The users can employ identities conforming to different standards to access the services they wish to use.
Testing environment and final users
The testing environment must cover the technologies that the development team uses.
More specifically:
Needed hardware:
  o N/A
Needed software:
  o Web Browser
Final User: Internal users / External users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
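A minimal sketch of the bearer-token check that an OAuth2-protected service performs when a user presents an identity token. The in-memory token store is purely illustrative; a real deployment would validate tokens against the identity provider (per the OAuth2 bearer-token scheme of RFC 6750):

```python
# Illustrative OAuth2-style bearer-token check. VALID_TOKENS stands in
# for validation against a real identity provider.

VALID_TOKENS = {"abc123": "alice"}   # token -> user (placeholder data)

def authenticate(authorization_header: str):
    """Return the user for a valid 'Bearer <token>' header, else None."""
    scheme, _, token = authorization_header.partition(" ")
    if scheme != "Bearer" or not token:
        return None
    return VALID_TOKENS.get(token)

assert authenticate("Bearer abc123") == "alice"   # accepted
assert authenticate("Basic abc123") is None       # wrong scheme rejected
```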
5.3.3.1.2 Security sub-system: dynamic configuration and flexible
policy provisioning
Description
The service developers define different policies, including different charging schemes, which implies that the security sub-system should monitor the diverse services differently.
Testing environment and final users
The testing environment must cover the technologies that the development team uses.
More specifically:
Needed hardware:
  o N/A
Needed software:
  o Web Browser
Final User: Internal users / External users / Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.3.3.1.3 Security sub-system: scalability and support of diverse
devices
Description
The sub-system should scale to high numbers of supported services and users, and should support different types of devices.
Testing environment and final users
The testing environment must cover the technologies that the development team uses.
More specifically:
Needed hardware:
  o N/A
Needed software:
  o Web Browser
Final User: Internal users / External users / Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.3.3.1.4 Security sub-system: provisioning/deprovisioning information communicated to the payment sub-system
Description
The security sub-system provides the payment sub-system with information about service usage per user, in order for the latter to calculate the charges and service usage statistics. This communication is crucial for the support of the pay-per-time charging model.
Testing environment and final users
The testing environment must cover the technologies that the development team uses.
More specifically:
Needed hardware:
  o N/A
Needed software:
  o Web Browser
Final User: Internal users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.3.3.2 Payment sub-system
5.3.3.2.1 Payment subsystem: support of three charging models
Description
The end users purchase products and services through the AoD, which are charged following three different models: pay per use, pay per time, and pay one-off. The P4A payment system keeps records of all payments and interfaces with a well-established payment gateway (e.g. PayPal) for the real transaction. Micro-payments (responding to micro-funding processes) are also supported.
Testing environment and final users
The testing environment must cover the technologies that the development team uses.
More specifically:
Needed hardware:
  o Accessibility tools (e.g. screen reader, speakers, microphones, etc.)
Needed software:
  o Web Browser
Final User: Internal users / External users / Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
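The three charging models can be illustrated with a short Python sketch. The function name and parameters are assumptions made for illustration only, and prices are kept in integer cents to avoid floating-point rounding; the actual payment sub-system interface is not defined here:

```python
def charge_cents(model: str, unit_price_cents: int, uses: int = 0,
                 minutes: int = 0) -> int:
    """Compute a charge (in cents) under the three models named above.
    'unit_price_cents' is the price per use, per minute, or the one-off
    price, depending on the model."""
    if model == "pay-per-use":
        return unit_price_cents * uses
    if model == "pay-per-time":
        return unit_price_cents * minutes
    if model == "one-off":
        return unit_price_cents
    raise ValueError(f"unknown charging model: {model}")

# 7 uses at 10 cents -> 70 cents; 30 minutes at 5 cents -> 150 cents.
assert charge_cents("pay-per-use", 10, uses=7) == 70
assert charge_cents("pay-per-time", 5, minutes=30) == 150
assert charge_cents("one-off", 999) == 999
```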
5.3.3.2.2 Payment sub-system: statistics for consumers and
suppliers
Description
The P4A payment sub-system maintains statistics on the usage of the offered services at different granularities and structures.
Testing environment and final users
The testing environment must cover the technologies that the development team uses.
More specifically:
Needed hardware:
  o Accessibility tools (e.g. screen reader, speakers, microphones, etc.)
Needed software:
  o Web Browser
Final User: Internal users / External users / Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
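Aggregating usage at different granularities can be sketched as follows; the event records and field names are illustrative assumptions, not the payment sub-system's actual data model:

```python
from collections import Counter

# Raw usage events as they might be reported by the security sub-system
# (shape is illustrative).
events = [
    {"user": "ann", "service": "robobraille"},
    {"user": "bob", "service": "robobraille"},
    {"user": "ann", "service": "captioning"},
]

# Two granularities: per service (for suppliers) and per user (for
# consumers).
per_service = Counter(e["service"] for e in events)
per_user = Counter(e["user"] for e in events)

assert per_service["robobraille"] == 2
assert per_user["ann"] == 2
```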
5.3.3.3 User bid sub-system
5.3.3.3.1 User bid sub-system: Initiation of new micro-funding
Description
An AoD user (service consumer) initiates a micro-funding process for a new service,
declaring his/her willingness to pay for that service.
Testing environment and final users
The testing environment must cover the technologies that the development team uses.
More specifically:
Needed hardware:
  o Accessibility tools (e.g. screen reader, speakers, microphones, etc.)
Needed software:
  o Web Browser
Final User: Internal users / External users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.3.3.3.2 User bid sub-system: Participation of potential service
consumers to micro-funding process
Description
The AoD users are offered the opportunity to contribute to an initiated micro-funding process (some respond positively, others do not).
Testing environment and final users
The testing environment must cover the technologies that the development team uses.
More specifically:
Needed hardware:
  o Accessibility tools (e.g. screen reader, speakers, microphones, etc.)
Needed software:
  o Web Browser
Final User: Internal users / External users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.3.3.3.3 User bid sub-system: Participation of potential service
developers to micro-funding process
Description
Service developers who have indicated interest in micro-funding processes are informed about the micro-funding process that was initiated by a service consumer. Some of them place proposals for developing and delivering the service.
Testing environment and final users
The testing environment must cover the technologies that the development team uses.
More specifically:
Needed hardware:
  o Accessibility tools (e.g. screen reader, speakers, microphones, etc.)
Needed software:
  o Web Browser
Final User: Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.3.3.3.4 User bid sub-system: closure of micro-funding process
Description
The micro-funding process ends according to rules defined in D201.1: a service developer is selected for developing the service, and the service consumers who declared their support are prompted to pay for the service.
Testing environment and final users
The testing environment must cover the technologies that the development team uses.
More specifically:
Needed hardware:
  o Accessibility tools (e.g. screen reader, speakers, microphones, etc.)
Needed software:
  o Web Browser
Final User: internal users / external users / developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
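The actual closure rules are defined in D201.1 and are not reproduced here; purely for illustration, the sketch below assumes the simplest possible rule, namely that the round succeeds when the total pledges cover the cheapest developer proposal:

```python
# Illustrative micro-funding closure check. The closure rule (cheapest
# proposal wins if pledges cover it) is an assumption for this sketch,
# not the rule from D201.1.

def close_round(pledges: dict, proposals: dict):
    """pledges: consumer -> pledged amount; proposals: developer ->
    asking price. Returns (selected_developer, consumers_to_charge),
    or (None, []) if the round does not succeed."""
    if not proposals:
        return None, []
    developer = min(proposals, key=proposals.get)   # cheapest proposal
    if sum(pledges.values()) >= proposals[developer]:
        return developer, sorted(pledges)           # prompt supporters to pay
    return None, []

# 30 + 25 = 55 covers devA's asking price of 50, so devA is selected.
dev, payers = close_round({"ann": 30, "bob": 25}, {"devA": 50, "devB": 70})
assert dev == "devA" and payers == ["ann", "bob"]
```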
5.3.3.4 QoS-Cost negotiations
Description
End users can search for services and can flexibly set QoS and cost ranges to improve the results of the service search.
Testing environment and final users
The testing environment must cover the technologies that the development team uses.
More specifically:
Needed hardware:
  o Accessibility tools (e.g. screen reader, speakers, microphones, etc.)
Needed software:
  o Web Browser
Final User: internal users / external users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
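The QoS/cost-range filtering described above can be sketched as follows; the service records and field names are illustrative assumptions:

```python
# Illustrative QoS/cost filter for a service search. Records and field
# names are placeholders, not the actual P4A service model.

def search(services, min_qos: float, max_cost: float):
    """Return services meeting the requested QoS floor and cost ceiling,
    cheapest first."""
    hits = [s for s in services
            if s["qos"] >= min_qos and s["cost"] <= max_cost]
    return sorted(hits, key=lambda s: s["cost"])

catalogue = [
    {"name": "captioning-basic", "qos": 0.90, "cost": 2.0},
    {"name": "captioning-pro",   "qos": 0.99, "cost": 5.0},
    {"name": "captioning-free",  "qos": 0.75, "cost": 0.0},
]

# Only "captioning-basic" satisfies QoS >= 0.85 at cost <= 4.0.
result = search(catalogue, min_qos=0.85, max_cost=4.0)
assert [s["name"] for s in result] == ["captioning-basic"]
```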
5.3.4 T201.4 Gamification Prototypes and modules
Description
- Gamification Community Platform: User authentication
- Gamification Community Platform: Code integration
- Gamification Community Platform: Crowdsourcing
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware:
  o Personal Computer (PC)
Needed software:
  o Web Browser
Final User: internal users / external users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.4 WP202 - Building Block Set
5.4.1 T202.1 Repository of components
Description
This task will manage the creation of a repository of components. The steps include:
- Create interactive backlog of components
- Define common acceptance criteria and quality standards for inclusion into the public
repository
- Research and track alternative components and processing libraries from outside
contributors and other organizations
- Selection of recommended common standards and APIs
- Iterative assessment of components
- Documentation of components in repository
- Actual creation and management of component repository in content management
system
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware:
  o Personal Computer (PC)
Needed software:
  o Web Browser
Final User: internal users / external users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.4.2 T202.2 - AT Specific I/O modules
5.4.2.1 AsTeRICS AT Modules
Description
This task includes developing the strategy for extraction of components of the AsTeRICS framework so that they become usable without the AsTeRICS middleware. (This outcome is also relevant for other WP202 subtasks that involve AsTeRICS components.) Additionally, in T202.2 the Universal HID Actuator will be made available as a usable component. The USB HID actuator is a USB dongle (hardware) which can behave like mouse/keyboard/joystick devices when plugged into a target computer's USB port. The HID actuator must be paired via Bluetooth with a control device. After this, mouse/keyboard/joystick control commands can be sent from the control device to the target machine without the installation of dedicated driver software on the target machine.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware:
o Bluetooth 3.0 for the control device
o USB HID actuator dongle
Needed software:
o Java JDK for using the API
Final User: Developers / Implementers (the hardware itself could also be used by end users if
correctly configured and integrated into a product).
5.4.2.2 Robobraille Translator modules
Description
Development and documentation of the following:
- Open Source version of RoboBraille
- Open source license agreement
- Implementation manual
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware:
o Five high-end desktop computers running Microsoft Windows; two configured as web and mail servers
Needed software:
o Text-to-speech engines
o OCR
o DAISY
o MS Visual Studio on .NET 4.x
Final User: Service providers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.4.3 Τ202.3 - Generic multimodal interaction modules
5.4.3.1 Open Source Input Transducer Prototyping Module
Description
The Input/Transducer Prototyping Modules support several low-cost microcontroller platforms
like the Arduino or Teensy controllers, which can be used to measure digital or analog signals.
Such modules are useful for creating specialised input devices for people with motor disabilities
or for controlling actuators in the environment. In the course of this task, an API will be
developed which enables developers to use these controllers easily, without the need to program
special firmware for the controllers.
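The API is planned in Java; the sketch below transliterates the idea to Python for illustration. It assumes, hypothetically, that the generic firmware streams text lines such as "A0:512" (analog pin A0, value 512) or "D3:1" over the serial port, and shows how an API layer could parse them into pin/value pairs without controller-specific code.

```python
# Illustrative sketch (assumed wire format, not the actual API):
# parse one line of a hypothetical "PIN:VALUE" serial protocol.

def parse_reading(line: str) -> tuple[str, int]:
    """Parse e.g. 'A0:512' into ('A0', 512), validating analog range."""
    pin, raw = line.strip().split(":")
    value = int(raw)
    # Arduino/Teensy ADCs commonly deliver 10-bit analog readings.
    if pin.startswith("A") and not 0 <= value <= 1023:
        raise ValueError("analog readings are expected in 0..1023")
    return pin, value
```

A client of the real API would receive such readings as events or polled values rather than parsing serial lines itself; the point here is only the separation between generic firmware and the host-side API.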
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware:
o Arduino, Teensy
o USB HID actuator dongle
Needed software:
o AVR-GCC compiler and download tools
o Java JDK for using the API and interfacing with the controllers
Final User: Developers / Implementers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.4.3.2 Haptic/Touch I/O Modules
Description
This task includes the development of a Haptic/Touch I/O Module that will enable adding
haptic/touch feedback to various User Interface components. Through this module, haptic
technologies could be applied in stand-alone or web-based applications, in order to add haptic
feedback in virtual objects. Then, these virtual objects can be haptically explored using a variety of
haptic devices (e.g. Phantom Desktop, Phantom Omni, Novint Falcon, etc.). Moreover, through the
Haptic/Touch I/O Module, touch feedback technologies could be applied also in mobile
environments (e.g. adding vibration feedback in mobile applications).
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Hardware:
o Haptic device(s) such as Phantom Desktop, Phantom Omni or Novint Falcon
Software:
o CHAI3D API
o C++
o Java JDK
o Drivers for selected haptic device
Final User: Developers / Implementers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.4.3.3 Camera input modules
Description
The camera input modules use a webcam and computer vision algorithms to extract features from
the live camera images, as for example the position of the face of a user or facial features. This
information can be used to create alternative user interfaces.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Software:
o OpenCV libraries
o VideoInput library
o Microsoft Visual Studio, DirectShow (Windows)
o Java JDK for using the API
Final User: Developers / Implementers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.4.4 T202.4 - Smart device and environment interconnection
modules
5.4.4.1 Smart home integration modules
Description
Several different technologies (KNX, FS20, EnOcean) for smart home control / building automation
will be made available via an easy-to-use API. The API covers basic actuation of pre-configured
nodes in the smart home network (for example, to turn on a light, control blinds, activate the
heating, etc.).
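A minimal sketch of such an easy-to-use facade is shown below. The class and method names are assumptions for illustration (the planned API is in Java), and a recording stand-in replaces the real bus coupler so the example is self-contained.

```python
# Hedged sketch of a technology-agnostic smart home facade. The real
# API would delegate to KNX/FS20/EnOcean transports; here a fake
# transport records commands so the example runs without hardware.

class SmartHome:
    def __init__(self, transport):
        self._transport = transport   # e.g. a KNX/IP router client

    def switch(self, node: str, on: bool) -> None:
        """Basic actuation of a pre-configured node (light, blinds...)."""
        self._transport.send(node, 1 if on else 0)

class RecordingTransport:
    """Test double standing in for the real bus coupler."""
    def __init__(self):
        self.sent = []
    def send(self, node, value):
        self.sent.append((node, value))

bus = RecordingTransport()
home = SmartHome(bus)
home.switch("kitchen/light", on=True)
```

The design point is that application code addresses pre-configured node names and never deals with KNX group addresses or EnOcean telegrams directly.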
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Hardware:
o KNX (incl. actuators, KNX/IP router)
o EnOcean components
o FS20 sender and actuator
Software: o Java JDK for using the API
Final User: Developers / Implementers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.4.5 T202.5 - Real time user monitoring modules
Description
The components provided herein will manage the communication with the supported biosignal
acquisition hardware units (OpenEEG/OpenBCI and compatible devices for measuring EEG, EMG,
ECG, etc.). The raw data stream or pre-processed data (basic signal processing via configurable
filters) will be made available via an API.
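As an illustration of the "pre-processed data" path, the sketch below applies a configurable moving-average filter to a raw sample stream, the simplest kind of filter such a pipeline might offer. The function is an illustration of the idea, not the component's actual interface.

```python
# Illustrative sketch: a configurable moving-average filter over the
# raw biosignal stream, as one example of the planned pre-processing.

from collections import deque

def moving_average(samples, window: int = 4):
    """Yield the running mean of the last `window` raw samples."""
    buf = deque(maxlen=window)
    for s in samples:
        buf.append(s)
        yield sum(buf) / len(buf)

# Smooth a short burst of raw samples with a 2-sample window.
smoothed = list(moving_average([0, 4, 8, 4], window=2))
```

In the real component the filter chain would be configured via the API and applied continuously to the device's data stream rather than to a finished list.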
Testing environment and end users
The testing environment must cover the technologies that the development team uses. More
specifically:
Hardware:
o OpenEEG- or OpenBCI-compatible biosignal amplifier
Software:
o Java JDK for using the API
Final User: Developers / Implementers
Test scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.4.6 T202.6 - Web based smart personalization and interface
adaptation modules
Description
- Validate automated test coverage
- Extend automated test coverage with new or refined UI automation tests
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Software: o Web Browser
Hardware: o Personal Computer (PC)
Final User: Internal users/External users/Developers
Test scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.5 WP203 - Collaborative development tools/ environments
5.5.1 T203.1 - Development tools for adaptive interfaces for
mainstream applications
5.5.1.1 Model-based development of Adaptive UIs
Description
Developers are provided with the possibility to define models of Abstract User Interfaces
based on a repository of patterns.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Software: o Modelling framework
Hardware: N/A
Final User: Developers
Test scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.5.1.2 Usage of web-based self-adaptive UI modules
Description
Applications generated by the Developer Tools will among others use the self-adaptive UI
modules created as part of T202.6 to compose fully adaptive user interfaces.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Software: o Web technologies (HTML5, CSS, JavaScript)
Hardware: N/A
Final User: Developers
Test scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.5.1.3 Generation of applications for the P4A runtime environment
Description
Applications generated by the Developer Tools based on Abstract User Interface models will
be executable in the Prosperity4All runtime environment provided by T203.3
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Software: o Web technologies (HTML5, CSS, JavaScript)
Hardware: N/A
Final User: Developers
Test scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.5.1.4 Extendability of the pattern set available for model-based
development
Description
Additional abstract descriptions of UI elements and UI adaptations can be added to the
set of patterns available for the modeling of abstract user interfaces.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Software:
o Modeling framework
o Web technologies
Hardware: N/A
Final User: Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.5.2 T203.2 - AT Configuration environment
5.5.2.1 Web ACS
Description
Web version of AsTeRICS Configuration Suite.
Testing environment and end users
The testing environment must cover the technologies that the development team uses. More
specifically:
Software:
o HTML5
o JavaScript
Final User: AT developers, AT service providers, end users with technical background.
Test scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.5.2.2 Internal model wizard
Description
Integration of predefined user settings and partial models into the Web ACS
Testing environment and end users
The testing environment must cover the technologies that the development team uses. More
specifically:
Software:
o HTML5
o JavaScript
Final User: AT developers, AT service providers, hobbyists with technical background, end users
with technical background.
Test scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.5.2.3 External model wizard
Description
Wizard for choosing predefined models, which are then directly started in the ARE.
Testing environment and final users
Final User: Clinicians, end users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.5.3 T203.3 – Runtime environment
Description
A REST interface will be defined and developed that allows providing the AsTeRICS Runtime
Environment (ARE) functionalities as REST services that can be accessed remotely by the Web-ACS.
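To illustrate the kind of REST interface envisaged, the sketch below maps hypothetical resource paths to ARE functionalities. The paths and verbs are assumptions for illustration only; the actual interface is to be implemented in Java with the Jersey framework.

```python
# Hypothetical sketch of REST resources the Web-ACS could call on the
# ARE. The route table and descriptions are illustrative assumptions.

ARE_ROUTES = {
    ("GET",  "/runtime/model"):       "get the currently deployed model",
    ("PUT",  "/runtime/model"):       "deploy a model sent by the Web-ACS",
    ("POST", "/runtime/model/start"): "start execution of the deployed model",
    ("POST", "/runtime/model/stop"):  "stop the running model",
    ("GET",  "/runtime/components"):  "list available plugin components",
}

def describe(method: str, path: str) -> str:
    """Look up what a (method, path) pair would do on the ARE."""
    return ARE_ROUTES.get((method.upper(), path), "unknown resource")
```

Exposing the ARE this way lets the browser-based Web-ACS deploy and control models remotely without any local installation beyond the runtime itself.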
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Software:
o Java
o OSGi
o Jersey RESTful Web Services framework
Final User: Internal users/External users/Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.5.4 T203.4 – Guidelines and frameworks for low cognitive and
stepping stone applications for low digital literacy
Description
Framework
A platform supporting run-time interactive components and allowing them to be configured
and interconnected in various ways so as to create apps targeting the needs of people with
low digital literacy and cognitive disabilities. The concept is to allow experimentation and
rapid prototyping in order to help improve research and knowledge in this area.
Components
Interactive 'widgets' based on HTML5 web components but with extra facilities required to
run in the framework and communicate with each other. Each is designed to provide a chunk
of fundamental functionality that can be connected to others, e.g. a video player.
Pool
The set of components that can be selected
Designer
An interactive tool allowing components to be discovered, selected, configured and
connected in order to create apps. It is designed to be easy to use, requiring few technical skills.
Apps
A collection of components that act together as a cohesive whole, providing clear functionality
to end users.
Gallery
Technical and UIX guidelines for creating components, using the above and knowledge
created
Guidelines
Technical and UIX guidelines for creating components, using the above and knowledge
created
GPII integration
The components and apps will support GPII auto-personalisation. Details are to be defined.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Software:
o HTML5
o Web Browser
o Apps
o GPII
Hardware:
o Personal Computer (PC)
o Smartphone
o Tablet
Final User: External users/Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.6 WP204 - Media and Material Automated/Crowdsourced
Transformation Infrastructures
5.6.1 T204.1 Modularization and Extension of Media Transform
Engine
Description
o Conversion, modularization, and documentation of PCF's Amara media transformation tools
o Development of a Media Transformation Infrastructure that can support crowdsourced services and maintenance improvement
o Facilitation of open development of new or better modules for the Materials Transformer Infrastructure/Toolkit
o Build compatibility to sync with additional media hosting platforms
o Build compatibility with the Amara editor for additional media hosting platforms and media types
o Build compatibility with the Amara display widget for additional media hosting platforms and media types
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Software:
o Python
o Django
o HTML
o CSS
o JS
Final User: Internal users/External users/Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.6.2 T204.2 Making Captioning Easier and Less Costly: Developing a
Low-Cost Crowd-Corrected Captioning Platform
Description
This activity will develop a new subtitle collaboration model and software allowing multiple users
to collaborate quickly to create, edit, and publish captions within roughly one minute or less,
enabling the captioning of real-time events with minimal delay.
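The core data problem behind such a collaboration model is merging near-simultaneous edits from several users quickly enough for near-real-time publishing. The sketch below uses a last-edit-wins merge keyed on segment start time; this policy is an illustrative assumption, not the designed algorithm.

```python
# Illustrative sketch: merge concurrent caption edits so the most
# recent edit of each segment (keyed by start time) wins.

def merge_captions(edits):
    """edits: iterable of (start_sec, end_sec, text, edit_timestamp)."""
    latest = {}
    for start, end, text, ts in edits:
        if start not in latest or ts > latest[start][2]:
            latest[start] = (end, text, ts)
    # Return segments in playback order, dropping the bookkeeping field.
    return [(s, e, t) for s, (e, t, _) in sorted(latest.items())]

merged = merge_captions([
    (0.0, 2.0, "helo world", 100),
    (0.0, 2.0, "hello world", 105),   # a later correction by another user
    (2.0, 4.0, "second line", 101),
])
```

A production system would also need conflict signalling and segment locking or operational transforms, but even this simple policy shows why per-segment timestamps must travel with every edit.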
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Software:
o Python
o Django
o HTML
o CSS
o JS
Final User: External users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.6.3 T204.3 Modularization and replicability of transformation
engine
Description
- Reimplementation of RoboBraille into a set of (1) conversion modules and (2) user
interface modules
- Specification and implementation of a stand-alone version (example implementation)
- Specification and implementation of interface components to facilitate integration
(RoboBraille API)
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Hardware:
o Five high-end desktop computers running Microsoft Windows, two configured as web and mail servers
Software:
o MS Visual Studio on .NET 4.x
o Text-to-speech engines
o OCR
o DAISY
Final User: Developers / Implementers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.6.4 T204.4 Modular interfaces to extend function for tables and
language translation
Description
- Specification and Implementation of uniform OSR module
- Development of prototype OSR module
- Specification and Implementation of uniform language-to-language translation module
- Specification and Implementation of uniform text-to-sign language translation module
- Specification of method to handle rich media contents
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Hardware: o Personal Computer (PC)
Software: o MS Visual Studio on .NET 4.x
Final User: Developers / Implementers
5.6.5 T204.5 Access to Media Enhanced Documents
Description
This activity will bring together the two Content Access teams to work on a method for
handling media-rich documents by developing a way to link the two types of transformation
engines described above.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Software:
o Python
o Django
o HTML
o CSS
o JS
Final User: Internal users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.6.6 T204.6 Mobile Support for Global Media Transformation
Platform
Description
Development activities include: Develop mobile focused interfaces for mobile phones and tablets.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Software:
o Python
o Django
o HTML
o CSS
o JS
Final User: External users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.7 WP205 - Assistance on Demand Services Infrastructure
5.7.1 AoD
5.7.1.1 AoD: User registration and profile management
Description
Users with different types of disabilities will register on the AoD platform and assess its
ease of use and accessibility features.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o Accessibility tools (e.g. screen reader, speakers, microphones etc)
Needed software: o Web Browser
Final User: internal users/external users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.7.1.2 AoD: Service registration and management
Description
Service suppliers/developers will register and manage the services they offer
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o N/A
Needed software: o Web Browser (inspector)
Final User: internal users/Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.7.1.3 AoD: Listing of Assistance Services
Description
End users (service consumers) will enter the AoD platform and search for a service matching
their needs. Ranking according to multiple criteria will be tested. The accessibility features of
the platform will be assessed.
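A minimal sketch of the multi-criteria ranking under test is shown below: each service receives a weighted score over several criteria. The criteria names and weights are illustrative assumptions, not the platform's actual scheme.

```python
# Illustrative sketch: rank assistance services by a weighted sum of
# per-criterion scores. Criteria and weights are assumed for the example.

def rank_services(services, weights):
    """services: list of (name, {criterion: score in [0, 1]})."""
    def score(criteria):
        return sum(weights[c] * criteria.get(c, 0.0) for c in weights)
    return sorted(services, key=lambda s: score(s[1]), reverse=True)

weights = {"rating": 0.5, "price": 0.3, "availability": 0.2}
ranked = rank_services(
    [("sign-language relay", {"rating": 0.9, "price": 0.4, "availability": 1.0}),
     ("captioning service",  {"rating": 0.7, "price": 0.9, "availability": 0.5})],
    weights)
```

Testing such a ranking amounts to checking that changes in individual criterion scores or weights reorder the listing as expected.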
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o Accessibility tools (e.g. screen reader, speakers, microphones etc)
Needed software: o Web Browser
Final User: internal users/external users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.7.1.4 AoD: Charging models
Description
The end users can check their charges and see that P4A AoD supports three different
charging models.
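The deliverable does not enumerate the three charging models at this point, so the sketch below assumes three common ones (free, pay-per-use and monthly subscription) purely to illustrate how a charge check could be exercised in a test; prices are placeholders.

```python
# Hypothetical sketch: a charge calculator for three assumed charging
# models (free / pay-per-use / subscription). Model names and prices
# are illustrative placeholders, not the platform's actual models.

def compute_charge(model: str, uses: int = 0, months: int = 0) -> float:
    if model == "free":
        return 0.0
    if model == "pay-per-use":
        return uses * 0.50        # placeholder price per use
    if model == "subscription":
        return months * 9.99      # placeholder monthly fee
    raise ValueError(f"unknown charging model: {model}")
```

A test scenario would then verify that the charges shown to the end user match the model attached to each consumed service.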
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o Accessibility tools (e.g. screen reader, speakers, microphones etc)
Needed software: o Web Browser
Final User: internal users/external users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.7.1.5 AoD: Multimodal Technical Support
Description
The end users (both service consumers and suppliers) can select among different modes of
technical support (text, video, speech, online, community based support)
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o Accessibility tools (e.g. screen reader, speakers, microphones etc)
Needed software: o Web Browser
Final User: internal users/external users/developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.7.1.6 AoD: Configurable Assistance on Demand Service Network
Description
The end users are offered the option to set up a network of assistance services for their
relatives or people they care about.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o Accessibility tools (e.g. screen reader, speakers, microphones etc)
Needed software: o Web Browser
Final User: internal users/external users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.7.1.7 AoD: try-harder, chain of services
Description
End users can search for services and look for improved ones based on QoS criteria
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o Accessibility tools (e.g. screen reader, speakers, microphones etc)
Needed software: o Web Browser
Final User: internal users/external users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.7.2 Social networking and other employment models: Finding
other end-users/suppliers with similar interests and making new
friends
Description
Service providers/end-users can connect to people with similar interests and make new
friends
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o Accessibility tools (e.g. screen reader, speakers, microphones etc)
Needed software:
o Web Browser
Final User: internal users/external users/Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.7.3 Social networking and other employment models: Chatting
with friends
Description
Service providers/end-users can chat with their friends
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o Accessibility tools (e.g. screen reader, speakers, microphones etc)
Needed software: o Web Browser
Final User: internal users/external users/developers
5.7.4 Social networking and other employment models: News feed
for staying up-to-date
Description
Service providers/end-users can be automatically informed when an event related to their
interests happens
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o Accessibility tools (e.g. screen reader, speakers, microphones etc)
Needed software: o Web Browser
Final User: internal users/external users/Developers
5.7.5 Social networking and other employment models: Supporting
crowd-based services by participating in user groups
Description
Service providers/end-users can participate in various user groups of interest to them, or even
create new user groups and invite friends to join, in order to support crowd-based services.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o Accessibility tools (e.g. screen reader, speakers, microphones etc)
Needed software: o Web Browser
Final User: internal users/external users/Developers
5.7.6 Social networking and other employment models:
Organization of events and meetings
Description
Service providers/end-users can organize events and meetings
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o Accessibility tools (e.g. screen reader, speakers, microphones etc)
Needed software: o Web Browser
Final User: internal users/external users/Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.8 WP206 - Sustainable Meaningful Consumers-Developer
Connections (Pull vs Push)
5.8.1 T206.2: Creation of feedforward mechanisms for directing
future development efforts
Description
A survey tool connected to online communities that already validate ideas. A double check on
ideas previously rated on other sites will be integrated through surveys. Hot topics of
challenges proposed on sites such as InnoCentive will be directly channeled.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o N/A
Needed software: o PHP o MySQL
Final User: external users
Test Scenarios
Luisa, from the national blind association, identifies the need for a mobile app that can
translate the colors of YouTube videos for colorblind people. This is a rather complicated task
due to the wide variety of types of colorblindness. Nonetheless, they publish this idea on Kickstarter
and simultaneously ask for developers willing to take on this task in crowdsourcing portals. The
national blind association is active in informing developers about the different profiles and
color combinations, all of this done with the P4All tool. When there is an initial prototype, it is
also tested within the P4All scheme. Surveys, remote testing and interviews are done with the
P4All tool in order to tune up the final result.
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.8.2 T206.3 Creation of Consumer Participatory R&D Mechanisms
Description
Developers will be able to ask customers about their needs through this survey tool.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o N/A
Needed software: o PHP o MySQL
Final User: external users / Developers
Test Scenarios
Brenda is Carl's sister; she is looking for ways to improve Carl's communication, since he is
deaf. She has heard about the possibilities of haptic communication. According to Brenda, if a
device like a mobile phone or a wristband can vibrate in different ways meaning different
things, that opens up a lot of possibilities. Brenda contacts a group of developers who are
familiar with this technology and opens a fundraising campaign on Kickstarter. The requirements
capture is done through the Prosperity4All tool, as are the initial prototypes and final tests.
The test scenarios are described in ANNEX I table: “Test Scenarios”.
5.8.3 T206.4 - Creation of Feedback and FeedPeer systems
Description
This tool will pose the main questions related to the software testing as well as conveying
that feedback. The questions and answers will be properly directed between consumers and
developers
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o N/A
Needed software:
o PHP
o MySQL
Final User: external users / Developers
Test Scenarios
LOCAM is a foundation active in the field of integrating people with cognitive disabilities into
the labor force. They have realised the potential of augmented reality in the field of
cognitive accessibility. Since they have already worked with the use of tablets by disabled
users at work (to receive instructions, order tasks, etc.), LOCAM want to help develop an app
that can give instructions about the use of devices when the user takes a picture of them. By
showing an explanation of how to use a photocopier or laundry machine only when pointing
the camera at it and its components, this app may start a revolution in helping people with
cognitive problems to accomplish tasks at work.
The test scenarios are described in ANNEX I table: “Test Scenarios”
5.8.4 T206.5 - Creation of consumer-mainstream communication
dimension to Unified Listing
Description
Connection with the main marketplace in order to enable and convey feedback on different
accessible services & AT.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o N/A
Needed software:
o PHP
o MySQL
Final User: external users / Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
6 Software verification and validation plan (SVVP –
SP3)
6.1 Introduction
The aim of the Software Verification and Validation Plan is to coordinate the validation using
common standards for the SP3 services. The services that SP3 has to deliver are summarized
in the following table:
WP Task Number Name (tool or service)
301
T301.1 Learning and training s/w applications
T301.2 Improving access to technology for dementia sufferers/carers
T301.3 Public Access Points to ICT
T301.4
Pluggable user interfaces for home appliances, home entertainment and home
services
T301.5 Game-based cognitive rehabilitation & maintenance
T301.6 Routing Guidance System
302
T302.1 Accessible BI
T302.2 Counselling and printing services
T302.3 Accessibility of learning material
T302.4 Special Educational Programmes and learning tools
T302.5 Integration of Prosperity4All with FLOE
303
T303.1 Consumer assistance on demand system
T303.2 Business Assistance on Demand system
T303.3 Enhancing existing technical AOD services
Table 2: List of SP3 services
The TV plans for the SP3 services are described below. At the moment the plans are tentative
and preliminary, as the project is still at an early implementation phase. It should be noted
that most of the SP3 implementation teams are going to integrate SP2 tools/services, and this
work can only start once the SP2 teams begin delivering prototypes and APIs ready for
integration, which is only starting within the first project year. Consequently, the definition of
what will be technically validated in terms of SP3 implementation is only tentative at this
stage. Nevertheless, the backbone TV plans are described below, following the order of the
table.
6.2 WP301 - Communication, Daily Living, Health and
Accessible Mobility
6.2.1 T301.1 - Learning and training s/w applications
6.2.1.1 Face Tracker Camera Input Module Integration
Description
The FaceTracker camera input module will be integrated into the FlashWords software
prototype. The operation of the module will be tested with different head movements.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware:
o FaceTracker camera input module
o Windows PC/tablet with webcam
Needed software:
o ActionScript 3.0 running in an Adobe AIR runtime environment
o Locally developed ATLab framework
o Integration of foreign modules with ANE
Final User: Internal Users and End-user
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
6.2.2 T301.2 - Improving access to technology for dementia
sufferers/carers
6.2.2.1 Apps for people with dementia
Description
Based on the core functionalities of Maavis, but using the P4A components to provide an HTML
web component solution with built-in GPII auto-personalisation.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o N/A
Needed software: o Web Browser
Final User: End-users
6.2.2.2 Screens and navigation
Description
A simple and consistent navigation model consisting of screens of buttons organised into three
layers: Activities, Collections and Players.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o N/A
Needed software: o Web Browser
Final User: End-users
6.2.2.3 Video Player
Description
A video player with simple controls for volume will be developed.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o N/A
Needed software: o Web browser
Final User: End-users
6.2.2.4 Music Player
Description
A music player with simple controls
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o N/A
Needed software: o Web browser
Final User: End-users
6.2.2.5 Slideshow
Description
A slideshow player with simple controls
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o Personal Computer (PC)
Needed software: o Web browser
Final User: End-users
6.2.2.6 Video Call
Description
A video call with simple controls for volume etc
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: o Personal Computer (PC)
Needed software: o Web browser
Final User: End-users
6.2.2.7 Info Player
Description
A web browser with simple controls
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: Personal Computer (PC)
Needed software: Web browser
Final User: End-users
6.2.2.8 Configuration
Description
A tool allowing the selection of media and basic configuration options.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: Personal Computer (PC)
Needed software: Web browser
Final User: End-users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
6.2.3 T301.3 - Public Access Points to ICT
Description
The application will allow users to control the PC with head movements.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: Personal Computer (PC)
Needed software: Web browser
Final User: End-users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
6.2.4 T301.4 - Pluggable user interfaces for home appliances,
home entertainment and home services
6.2.4.1 Template URC sockets
Description
Abstract user interface descriptions for common devices in households (e.g., HVAC, TV,
Entertainment). URC sockets are used to build concrete device- and user-specific interfaces.
The template sockets contain the basic abstract user interface information that most devices within a group (e.g., HVAC devices) have in common. Therefore, the template sockets can be used as a foundation by developers to build concrete user interfaces for many household devices.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: N/A
Needed software: URC (XML)
Final User: Developers
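Conceptually, a template socket defines the variables and commands that all devices of a group share, and a concrete, device-specific socket extends it. The following Python sketch only illustrates that specialisation relationship; the real template sockets are URC XML documents, and every name used here (such as `targetTemperature` or `LivingRoomAC`) is invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class TemplateSocket:
    """Abstract UI information shared by all devices of a group (e.g. HVAC)."""
    group: str
    variables: dict = field(default_factory=dict)   # variable name -> type
    commands: list = field(default_factory=list)

    def specialise(self, device_name, extra_variables=None, extra_commands=None):
        """Build a concrete, device-specific socket on top of the template."""
        concrete = TemplateSocket(
            group=self.group,
            variables={**self.variables, **(extra_variables or {})},
            commands=self.commands + (extra_commands or []),
        )
        concrete.device_name = device_name
        return concrete

# Template socket for the HVAC group: properties most HVAC devices share.
hvac_template = TemplateSocket(
    group="HVAC",
    variables={"targetTemperature": "int", "mode": "enum", "fanSpeed": "int"},
    commands=["powerOn", "powerOff"],
)

# A developer derives a concrete socket for one device, adding what is specific.
living_room_ac = hvac_template.specialise(
    "LivingRoomAC", extra_variables={"swingMode": "bool"}
)
# All template variables remain present in the concrete socket.
print(sorted(living_room_ac.variables))
```

The template thus acts as the shared foundation; only the device-specific additions have to be authored per device.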
6.2.4.2 Simple user interfaces for older people and people with mild
cognitive disabilities.
Description
We will provide simple web user interfaces for the template URC sockets.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: N/A
Needed software: Web technologies (HTML, CSS, JavaScript)
Final User: End-users
6.2.4.3 Proof-of-Concept implementation of the template sockets and
the simple user interfaces in the URCLab (Smart-Home/AAL) laboratory at
the HdM
Description
We will install real household devices (e.g., HVAC, TV, entertainment) in our laboratory (AAL-Lab) at the HdM and use them as a test-bed for the template sockets and the simple user interfaces.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware:
o Real devices
o Connection technology (e.g., EnOcean, ZigBee, Bluetooth)
Needed software: Web technologies (HTML, CSS, JavaScript)
Final User: End-users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
6.2.5 T301.5 - Game-based cognitive rehabilitation & maintenance
Description
Sociable is an application addressed to elderly people with light to mild Alzheimer’s disease and their carers, offering game-based cognitive training activities to the elderly and monitoring capabilities to their carers. Sociable runs on the Microsoft PixelSense platform and on Windows tablets.
Within the P4A project, more input/output modalities will be supported in the application
thanks to the integration of SP2 tools. Specifically, the following components will be targeted
for integration in the first project phase: haptic I/O and camera input module.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware:
o MS PixelSense platform or any Windows-based tablet
o Camera
o Haptic device
Needed software: Sociable application
Final User: External users
6.2.6 T301.6 - Routing Guidance System
6.2.6.1 Automatic Speech Recognition for dysarthria
Description
ASR enables users to give navigation commands (the desired destination) to the system by speech, in addition to the default option (typing).
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: Android device
Needed software: Android SDK
Final User: External users
6.2.6.2 MLS Live Services
Description
Services such as weather conditions, points of interest (PoIs: gas stations, pharmacies, etc.) and news are available.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: N/A
Needed software: Service
Final User: External users
6.2.6.3 Stress measurement
Description
An external (USB) device or application detecting stress.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: External USB device
Needed software: Android SDK
Final User: External users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
6.3 WP302 - Education eLearning, Business and Employment
6.3.1 T302.1 - Accessible BI
6.3.1.1 Accessible Business Intelligence navigator
Description
We will provide a simplified Business Intelligence tool to access available reports using an adaptable, accessible interface.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed: FLUID Technology
Final User: End users
6.3.1.2 Accessible Business Intelligence Chart reporting
Description
We will integrate an accessible charting library with support for access to business intelligence data by visually impaired users.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed: Amcharts
Final User: End users
6.3.1.3 Accessible georeferenced Business Intelligence
Description
We will integrate tools to allow visually impaired end users to access location intelligence reports with the use of assistive technologies.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed: Describer (or similar)
Final User: End users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
6.3.2 T302.2 - Counselling and printing services
Description
People with blindness do not have access to this kind of visual information. Thus, it is
important to make graphics accessible to blind people. One way to make graphics accessible
is to create tactile graphics, which can be produced by special printers. Graphical elements
such as mathematical curves are printed in such a way that they can be read with the fingers.
To allow cooperation with sighted people, tactile graphics are also printed in color and
provided with regular script. The goal of this service is to offer a printing service to blind people
with higher education throughout Europe. Establishing such a print service requires putting
the different steps into a semi-automatic workflow. In a first step, the SZS creates a workflow
which can be used by other partner organisations within Europe.
This is followed by setting up a web server (Linux) which automatically executes the developed process. Process: a) an external user uploads a printable source file (SVG); b) an employee of the SZS checks, accepts and approves the file; c) the last part of the process starts: the web tool prints all material, including the bill, envelope, etc.; d) the material is sent to the user. Within this workflow we would need access to a micropayment tool and marketplace to offer this service.
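The four workflow steps can be sketched as a small state machine. The following Python sketch is illustrative only: the `PrintJob` class and its state names are our own invention, not part of the actual SZS web tool.

```python
from enum import Enum, auto

class State(Enum):
    UPLOADED = auto()
    APPROVED = auto()
    REJECTED = auto()
    PRINTED = auto()
    SHIPPED = auto()

class PrintJob:
    """One tactile-printing request moving through the semi-automatic workflow."""
    def __init__(self, svg_file, user):
        self.svg_file = svg_file          # a) user uploads a printable SVG source
        self.user = user
        self.state = State.UPLOADED

    def review(self, approved):
        # b) an SZS employee checks, accepts and approves (or rejects) the file
        if self.state is not State.UPLOADED:
            raise RuntimeError("job already reviewed")
        self.state = State.APPROVED if approved else State.REJECTED

    def print_material(self):
        # c) the web tool prints all material, including bill and envelope
        if self.state is not State.APPROVED:
            raise RuntimeError("only approved jobs can be printed")
        self.state = State.PRINTED

    def ship(self):
        # d) the printed material is sent to the user
        if self.state is not State.PRINTED:
            raise RuntimeError("only printed jobs can be shipped")
        self.state = State.SHIPPED

job = PrintJob("curve_plot.svg", "student@example.org")
job.review(approved=True)
job.print_material()
job.ship()
print(job.state.name)  # SHIPPED
```

The guard conditions make the workflow semi-automatic: only the review step needs a human, while invalid transitions (e.g. printing an unreviewed file) are rejected automatically.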
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware:
o Linux-based web server (internal)
o Tactile printer
Needed software:
o Java
o Perl/Python
Final User: Internal users/External users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”.
6.3.3 T302.3 - Accessibility of learning material
Description
The adaptation service of the SZS is available to blind and partially sighted students as well as their
lecturers. A team at the SZS adapts not only lecture notes, scientific papers, presentations and
different documents, but also exams, books and further study-relevant material necessary for the
participation in lectures and examinations. The study material is prepared electronically. It can be
read at a computer with a braille display or a screen reader – alternatively also printed in braille.
Many diagrams are additionally made available with textual description and/or printed as a tactile
graphic. All converted material is archived electronically to be available for future visually impaired
students. The adaptation itself takes place according to current criteria of scientific text adaptation
and is being constantly further developed and/or tailored to the needs of the students. The SZS
would offer various workshops about the accessibility of mathematics, scientific documents,
graphics and computer science. The SZS has collected and developed a vast amount of different technologies/tools and teaching material about IT & AT technologies in this area. These can be contributed to a Unified Listing, as well as adapted to the web services developed within the P4A project.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: Training computers
Needed software:
o eLearning platforms
o Web services for the adapted tools
Final User: Internal users/External users
6.3.4 T302.4 - Special Educational Programmes and learning tools
Description
Integrating the Input Transducer prototyping module (FHTW) into the FlashWords software prototype. Testing the function of different sensors and alternative input devices.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware:
o Input Transducer prototyping module (FHTW)
o Windows PC/tablet
o Microcontroller board (Arduino, Teensy)
o Sensors (e.g. bend, force, gyro, pressure, accelerometer)
Needed software:
o ActionScript 3.0 running in an Adobe AIR runtime environment
o Locally developed ATLab framework
o Integration of foreign modules with ANE
Final User: Internal users and end-users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
6.3.5 T302.5 - Integration of Prosperity4All with FLOE
Description
- Validate automated test coverage
- Perform manual QA tests based on documented test plans
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: Personal Computer (PC)
Needed software: Web browser
Final User: End-users
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
6.4 WP303 – Assistance on Demand Service
6.4.1 T303.1 - Consumer assistance on demand system
Description
A tool that can convey user demands in a rapid way (one click). A relation with the user panel will be made in all components.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: N/A
Needed software:
o PHP
o MySQL
Final User: external users / Developers
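A one-click demand essentially means that a single action creates a complete, timestamped, queued request, with no form filling on the user's side. Below is a minimal sketch of that idea, written in Python for brevity even though the tool itself targets PHP/MySQL; the function and field names are invented, and a plain dict stands in for the database table.

```python
import time
import uuid

# In the real tool this table would live in MySQL; a dict stands in for it here.
DEMANDS = {}

def one_click_demand(user_id, demand_type="assistance"):
    """Convey a user demand in a single action: one call creates, timestamps
    and queues the request on behalf of the user."""
    demand_id = str(uuid.uuid4())
    DEMANDS[demand_id] = {
        "user": user_id,
        "type": demand_type,
        "created": time.time(),
        "status": "queued",     # picked up later by an assistance provider
    }
    return demand_id

demand_id = one_click_demand("user42")
print(DEMANDS[demand_id]["status"])  # queued
```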
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
6.4.2 T303.2 - Business Assistance on Demand system
Description
A tool that can convey user demands in a rapid way (one click). A relation with the user panel will be made in all components.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware: N/A
Needed software:
o PHP
o MySQL
Final User: external users / Developers
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
6.4.3 Τ303.3 - Enhancing existing technical AOD services
Description
Integrating the AoD module (TECH) into the FlashWords software prototype. Simulating different user problems and solutions.
Testing environment and final users
The testing environment must cover the technologies that the development team uses. More
specifically:
Needed hardware:
o Windows PC/tablet
o Microphone
o Web camera
Needed software:
o ActionScript 3.0 running in an Adobe AIR runtime environment
o Locally developed ATLab framework
o Integration of foreign modules with ANE
Final User: Internal Users and End-user
Test Scenarios
The test scenarios are described in ANNEX I table: “Test Scenarios”
7 Conclusions and future plans
This document is the first version of D401.2, “Prosperity4All technical validation: plans and results“. As already mentioned in Chapter 4, there are four different evaluation phases during the lifecycle of the project. The aim of D401.2 is to define specific technical validation plans for the tools/applications/services developed in the framework of SP2/SP3; these plans will be updated throughout the project as development evolves.
Defining the technical validation plans at this phase of the project was a hard task because many tools/applications/services are in an initiation phase; thus, many aspects of the technical validation plan for each SP2 and SP3 tool/application are still under continuous formulation. Despite this difficulty, this version of D401.2 succeeded in defining initial Software Verification and Validation Plans for all SP2 and SP3 deliverables. The descriptions, the testing environment and the test scenarios for each task and sub-task are described, and this backbone of the technical validation forms the main part of this document.
As described in Chapter 4, technical validation will always have to be performed at the integration, system and acceptance levels two months before each testing iteration with the users. Currently, the Consortium is defining which SP2/SP3 tasks will deliver prototypes for the first user testing iteration so that their technical validation can start shortly. The technical validation team is ready to run test scenarios on the chosen SP2/SP3 deliverables, based on the presented TV plans and their possible refinements, and to provide the results to the development teams for improvement. The results of this exercise will also be available in the wiki and in the developer space.
In the meantime, the technical validation team will prioritise and categorise the existing test scenarios based on the availability of SP2/SP3 prototypes, and will set up the collaboration environments with each SP2/SP3 team so that the technical validation can be run. In cooperation with the SP2 and SP3 partners, the technical validation team will prioritise and categorise the test scenarios according to the following criteria:
o the progress of the SP2 and SP3 deliverables,
o the scheduled delivery plan,
o the interrelation of the deliverables,
o the ability to create the testing environment.
All tests run in the first phase will target new releases or updated software announced by the SP2 or SP3 teams. The validation process will be continuous, and all the methods and tools described above will be used.
8 References
1. ISO/IEC 14598
2. ISO/IEC 9126
3. ISO/IEC 25062
4. Process Pro, “10 Steps to Software Validation”
5. ESA Board for Software Standardisation and Control (BSSC), “Guide to Software Verification and Validation”
9 ANNEX I: Test Scenarios
The test scenarios for each task and subtask of SP2/SP3 tools/services are summarized in the following tables. During the testing of each
task/subtask the technical validation team will complete the “Assessment Objective” and “Indicator” columns. According to these the technical
validation team will deliver the testing results to the SP2/SP3 team for review and updates.
9.1 SP2 Test Scenarios
WP201 – System Architecture and Unified Listing / Marketplace
T201.1 Developer Space
WP201_T201.1_TEST1
Test description: Contribute to Developer Space automated test coverage.
Steps to execute: Each testing iteration, add new unit and acceptance tests to the automated test suite.
Expected results: The resulting automated tests will help to ensure that the Developer Space can be tested and evaluated continuously during development, prior to deployment of new releases, and in preparation for major project milestones such as pilot testing.
Status: Not yet begun

WP201_T201.1_TEST2
Test description: Validate automated test coverage.
Steps to execute: Prior to major project milestones such as pilots, all unit and acceptance tests will be run in supported browsers. The anticipated list includes the latest versions of Chrome, Firefox, Internet Explorer and Safari on Mac and Windows.
Expected results: All tests should pass.
Status: Not yet begun

WP201_T201.1_TEST3
Test description: Verify Developer Space search functionality.
Steps to execute: Based on a cross-section of to-be-determined (at each Developer Space release milestone) key queries and results, ensure that the expected components and tools are returned when searching.
Expected results: The expected components and tools should be returned by the Developer Space search engine, ranked appropriately.
Status: Not yet begun
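The search-verification test above (WP201_T201.1_TEST3) can be automated along the following lines. This Python sketch uses an invented stub catalogue and `search` function, since the real Developer Space search engine is not part of this document; only the shape of the check (key queries mapped to expected components) follows the test plan.

```python
# A stub standing in for the Developer Space component catalogue.
CATALOGUE = [
    {"name": "screen-reader-bridge", "tags": ["audio", "accessibility"]},
    {"name": "haptic-io-module", "tags": ["haptic", "input"]},
    {"name": "camera-input-module", "tags": ["camera", "input"]},
]

def search(query):
    """Return catalogue entries whose name or tags contain the query term."""
    q = query.lower()
    return [c for c in CATALOGUE
            if q in c["name"] or any(q in t for t in c["tags"])]

# Key queries and their expected components, as the test plan prescribes
# (the queries themselves would be fixed at each release milestone):
EXPECTED = {
    "haptic": {"haptic-io-module"},
    "input": {"haptic-io-module", "camera-input-module"},
}

# Ensure every key query returns at least the expected components.
for query, expected_names in EXPECTED.items():
    found = {c["name"] for c in search(query)}
    assert expected_names <= found, f"missing results for {query!r}"
print("all key queries returned the expected components")
```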
T201.2 Unified listing and marketplace architecture and implementation
WP201_T201.2a Test 1
Test description: Test of the human interface of the Unified Listing.
Steps to execute: Go to the Unified Listing. Take a sample of the personas from Cloud4all and Prosperity4All. Search the database for solutions. Then make up a person facing barriers due to disability and search for solutions for them.
Expected results: Should find both assistive technology and mainstream features that relate to the needs of each persona, and the new example person.
Status: Not yet begun

WP201_T201.2a Test 2
Test description: Test of the API to the Unified Listing.
Steps to execute: Send a query to the Unified Listing API asking for data on all products and services.
Expected results: Should get a full listing. As a quick check of completeness, all results found in Test 1 for those individuals should be present in the full data dump. The number of records obtained should also match the number of complete records reported in the Unified Listing, and all records should be intact.
Status: Not yet begun

WP201_T201.2b Test 1
Test description: Test of the human interface of the Open Marketplace.
Steps to execute: Go to the Open Marketplace. Look up products from the Part a tests that were identified as being in the Open Marketplace. Check to see if sufficient information is provided on the product to allow purchase (or rent as appropriate). If none of the products for the people chosen in the Part a test are in the Open Marketplace, then just browse the Open Marketplace for products.
Expected results: All information should be available in the marketplace with sufficient information to either purchase or to look further into the product.
Status: Not yet begun

WP201_T201.2b Test 2
Test description: Test of accessibility of the human interface of the Open Marketplace.
Steps to execute: A standard website accessibility tool is used to evaluate the site against WCAG 2.0.
Expected results: Passes WCAG 2.0 at level AA.
Status: Not yet begun

WP201_T201.2b Test 3
Test description: Test of the ability to use the human interface of the Open Marketplace without AT.
Steps to execute: Go to the Open Marketplace. See whether the site can be used (without using any assistive technologies or access features not provided by the site) when:
- the tester is blindfolded (or blind);
- the tester uses a keyboard (or alternate keyboard) only to access the site;
- the tester has sound turned off on the computer (or is deaf);
- the tester has the site set to a scramble font (a testing font that scrambles random characters on screen so that it is not easily or accurately readable) or has a severe print-language disability;
- the tester has the screen located 5 meters away from them, or has vision of 20/200.
Expected results: All users should be able to use the site without any external assistive technologies.
Status: Not yet begun
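The completeness checks of WP201_T201.2a Test 2 (all Test 1 results present in the dump, record count matching the reported count, records intact) can be expressed as simple assertions. The data below is entirely made up for illustration; the real test would fetch the full dump from the Unified Listing API.

```python
# Records returned by the (hypothetical) full API dump, the results of the
# human-interface searches from Test 1, and the reported record count.
api_dump = [
    {"id": 1, "name": "ScreenReader X", "vendor": "ACME"},
    {"id": 2, "name": "Magnifier Y", "vendor": "ACME"},
    {"id": 3, "name": "SwitchAccess Z", "vendor": "Foo"},
]
test1_results = [{"id": 1}, {"id": 3}]     # solutions found for the personas
reported_record_count = 3                  # count the Unified Listing reports

dump_ids = {r["id"] for r in api_dump}

# Every solution found through the human interface must be in the full dump.
assert {r["id"] for r in test1_results} <= dump_ids

# The dump size must match the record count the Unified Listing reports.
assert len(api_dump) == reported_record_count

# Every record must be intact, i.e. no mandatory field missing or empty.
assert all(r.get("name") and r.get("vendor") for r in api_dump)
print("full data dump is complete and intact")
```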
T201.3 Security Architecture and Secure Payment Infrastructure
WP201_T201.3_TEST1
Test description: The end users (service consumers and suppliers) enter the AoD platform and access its services using different identity standards.
Steps to execute:
1. A service consumer registers and enters the AoD platform using an OAuth2 identity. He/she logs in successfully.
2. A service consumer registers and enters the AoD platform using a SAML-compliant identity. He logs in successfully.
3. A user tries to enter the AoD using wrong credentials and is incapable of accessing AoD services.
Expected results: A security system report with the registration and the attempts to enter the system, together with their outputs.
Status: Not yet begun

WP201_T201.3_TEST2
Test description: Every service supplier can define whether his/her service offered through the AoD is free of charge or associated with fees, and the type of the associated charging model.
Steps to execute:
1. A service supplier defines that the service he offers is freely available.
2. Another service supplier defines that his service is paid on site (which is the same as "freely available" for the security subsystem).
3. Another service supplier defines that his service is offered on a pay-per-time basis.
Expected results: The security system registers the service details and keeps track of the usage of all services (irrespective of the associated charges).
Status: Not yet begun

WP201_T201.3_TEST3
Test description: Scalability and support of diverse devices by the security sub-system.
Steps to execute:
1. A service consumer enters the AoD using his laptop.
2. A service consumer enters the AoD using his mobile phone.
3. A service consumer enters the AoD using his tablet.
4. High numbers of users enter the AoD to check the scalability of the system.
Expected results: The security system is expected to allow users with correct credentials to enter the AoD irrespective of the device type.
Status: Not yet begun
WP201_T201.3_TEST4
Test description: Provisioning/deprovisioning information communicated to the payment subsystem.
Steps to execute:
1. Four different service suppliers register an equal number of services associated respectively with the free-of-charge, one-off payment, pay-per-use and pay-per-time models.
2. Ten service consumers access these four services (either all of them or subsets of them).
3. The payment sub-system requests from the security sub-system the usage information for these four services.
4. The security sub-system replies providing information about the usage per partner.
Expected results: The payment system provides the usage per service and user to the interested AoD user.
Status: Not yet begun

WP201_T201.3_TEST5
Test description: Support of three charging models by the AoD (through the payment sub-system).
Steps to execute:
1. Service suppliers register multiple services associated with the pay-per-use, one-off and pay-per-time charging models.
2. An end user purchases four services of different charging models.
3. The end user enters the AoD and checks the usage and charges of the services he has purchased.
4. The AoD (after communicating with the payment sub-system) presents him the usage and current status of charges and payments.
Expected results: The user (service consumer) is presented with charging and payment information about the services he has purchased.
Status: Not yet begun

WP201_T201.3_TEST6
Test description: Statistics and payment information for service suppliers.
Steps to execute:
1. A service supplier has registered four services associated with different charging models.
2. Service consumers purchase the services offered by the service supplier of step 1.
3. The service supplier enters the AoD to check the usage and payment details related to the services he offers.
4. The service supplier is offered information about the usage, the accomplished payment transactions and the pending/forthcoming payments.
Expected results: The user (service supplier) is presented with usage and payment information about the services he offers.
Status: Not yet begun

WP201_T201.3_TEST7
Test description: Micro-funding process initiation.
Steps to execute:
1. A service consumer enters the AoD (successful login).
2. The service consumer enters the user-bid subsystem and initiates a micro-funding process, declaring the purpose of the service to be developed and details regarding the final assignment.
Expected results: A new micro-funding process is initiated and listed in the AoD.
Status: Not yet begun

WP201_T201.3_TEST8
Test description: Participation of potential service consumers in a micro-funding process.
Steps to execute:
1. Three service consumers of the AoD register, and two of them declare they are willing to participate in a micro-funding process while the third is not.
2. The three service consumers enter the AoD and two of them are presented with the option to participate in the micro-funding process. Each of them declares his willingness to participate with a different amount of money.
Expected results: The contributors' information is registered in the micro-funding process database.
Status: Not yet begun

WP201_T201.3_TEST9
Test description: Participation of potential service suppliers in a micro-funding process.
Steps to execute:
1. Three service suppliers enter the AoD and place three different offers for developing the service described in the micro-funding process.
Expected results: The proposals of the service suppliers are registered in the micro-funding process database.
Status: Not yet begun

WP201_T201.3_TEST10
Test description: User bid closure: the micro-funding process ends according to the rules defined in D201.1.
Steps to execute:
1. Following the previous tests, both contributions and service development proposals have been registered in the user-bid system.
2. The micro-funding process closes according to the rules defined in D201.1.
3. The selected service developer is notified.
4. The participating service consumers are notified about the process closure, and payment information is forwarded from the user-bid system to the payment sub-system.
Expected results:
1. The involved users (both service consumers and service suppliers) are notified about the micro-funding process closure.
2. Relevant payment orders are communicated to the payment sub-system.
Status: Not yet begun
WP201_T201.3_TEST11
Test description: AoD: registration and profile management of users with different types of disabilities.
Steps to execute:
1. A user (either service consumer or service supplier) registers in the P4A AoD and later on modifies his profile.
2. A user (either service consumer or service supplier) with vision impairments uses a screen reader to register in the P4A AoD and later on to modify his profile.
3. A user (either service consumer or service supplier) with motor impairments uses only the keyboard to register in the P4A AoD and later on to modify his profile.
4. Users with other types of disabilities register and manage their profiles. (Details are provided in the use case scenarios document of WP205.)
Expected results: The user database of the AoD is filled in and the user can enjoy the AoD functionality.
Status: Not yet begun

WP201_T201.3_TEST12
Test description: AoD: Service registration and management.
Steps to execute:
1. A service supplier (having registered in the AoD platform) enters the AoD to register a new human-based service paid on a pay-per-use model.
2. A service supplier/developer (having registered in the AoD platform) enters the AoD to register a new machine-based service charged following the one-off model.
3. A service supplier (having registered in the AoD platform) enters the AoD to register a new crowd-sourced service following the pay-per-time model.
Expected results: The service database of the AoD and the service-payment database of the payment subsystem are populated.
Status: Not yet begun

WP201_T201.3_TEST13
Test description: AoD: Listing of Assistance Services.
Steps to execute:
1. A service consumer enters the AoD and searches for a service by providing specific keywords.
2. The AoD provides a list of services following a user-defined criterion.
3. Another service consumer with vision impairments searches for another service and the AoD responds again with a list of services.
4. Similarly, service consumers with different types of disabilities perform a service search. (Details are provided in the relevant document of WP205.)
Expected results: All accessibility features of the AoD are checked.
Status: Not yet begun

WP201_T201.3_TEST14
Test description: AoD: Multimodal Technical Support. The users of the AoD platform are offered technical support using different modalities.
Steps to execute:
1. An end user seeks technical support and is offered a palette of different technical support modes. He opts for accessing a manual.
2. The same user feels he has not found the support he was looking for and decides to watch a video on the use of the AoD.
3. Another user interested in getting technical support opts for a crowd-based technical support service. He decides to get help from a volunteer named Alicia.
4. The previous user decides to turn to a paid technical support service.
Expected results: Existence of technical support regarding the use of the AoD exploiting different modalities (and charging schemes).
Status: Not yet begun

WP201_T201.3_TEST15
Test description: AoD - Configurable Assistance on Demand Service Network: a carer sets up a network of AoD services for his relative.
Steps to execute:
1. An AoD user interested in setting up an AoD service network enters the AoD and follows guided steps.
2. In each step, the user (carer) selects one or more AoD services for covering the needs of his relative.
3. In the end, an AoD service network is set up.
4. Another carer is allowed to modify this network to adapt to the changing needs of the person with disabilities.
Expected results: Set-up of an AoD service network (relevant registrations and links in the service-user database).
Status: Not yet begun

WP201_T201.3_TEST16
Test description: AoD: chain of services based on QoS criteria.
Status: Not yet begun

WP201_T201.3_TEST17
Test description: AoD: Service search based on QoS-cost trade-offs.
Steps to execute:
1. A service consumer selects a machine-based service.
2. The user is not satisfied by the quality of this service and decides to press the "try harder" button to seek a similar service associated with higher QoS.
3. The AoD presents him a list of relevant services ranked based on QoS criteria.
Expected results: Service listing based on QoS criteria; availability of the "try harder" button.
Status: Not yet begun
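Several of the tests above exercise the free, one-off, pay-per-use and pay-per-time charging models. The following Python sketch shows one way charges under these models might be computed; the per-minute interpretation of pay-per-time and all prices and usage figures are assumptions for illustration, not project definitions.

```python
def compute_charge(model, price, uses=0, seconds=0):
    """Return the amount owed for one service under a given charging model."""
    if model == "free":
        return 0.0
    if model == "one-off":
        return price                      # single payment, usage irrelevant
    if model == "pay-per-use":
        return price * uses               # each invocation is billed
    if model == "pay-per-time":
        return price * (seconds / 60.0)   # price taken as per minute here
    raise ValueError(f"unknown charging model: {model}")

# An end user who has purchased four services of different charging models:
purchased = [
    ("free",         0.0, {"uses": 10}),
    ("one-off",     20.0, {}),
    ("pay-per-use",  0.5, {"uses": 8}),
    ("pay-per-time", 1.0, {"seconds": 300}),
]
total = sum(compute_charge(m, p, **usage) for m, p, usage in purchased)
print(f"current charges: {total:.2f}")  # 0 + 20 + 4 + 5 = 29.00
```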
T201.4 Gamification Prototypes and Modules

WP201_T201_4_TEST_1
Test description: Gamification Community Platform: User authentication.
Steps to execute: User login, simple account management, user logout.
Expected results: The platform loads and displays personalized output.
Status: Not yet begun

WP201_T201_4_TEST_2
Test description: Gamification Community Platform: Code integration.
Steps to execute: Repository commit; quests update accordingly.
Expected results: Integration of a versioning system (GitHub, SVN, ...); automated quest generation.
Status: Not yet begun

WP201_T201_4_TEST_3
Test description: Gamification Community Platform: Crowdsourcing.
Steps to execute: User login, detecting similar users/projects.
Expected results: The user finds other users or projects with similar (statistical) properties.
Status: Not yet begun
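WP201_T201_4_TEST_2 expects quests to update when the versioning system reports a commit. One possible shape of such a rule-based update is sketched below in Python; the quest names, point values and event names are invented for the example.

```python
# Hypothetical mapping from repository events to gamification quests.
QUEST_RULES = {
    "commit":       ("First Commit", 10),
    "pull_request": ("Contributor", 25),
    "issue_closed": ("Bug Squasher", 15),
}

def update_quests(profile, event):
    """Update a user's quest log when the versioning system reports an event."""
    if event not in QUEST_RULES:
        return profile                    # unknown events are ignored
    quest, points = QUEST_RULES[event]
    profile.setdefault("quests", []).append(quest)
    profile["points"] = profile.get("points", 0) + points
    return profile

# A webhook from the versioning system (GitHub, SVN, ...) would call this:
dev = {"name": "alice"}
update_quests(dev, "commit")
update_quests(dev, "issue_closed")
print(dev["quests"], dev["points"])  # ['First Commit', 'Bug Squasher'] 25
```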
WP202 – Building Block Set
T202.1 Repository of components
Test Number Assessment
Objective
(Annex I)
Indicator
(Annex I)
Test description Requirement Steps to execute Expected results Status
WP202_T202.1_Test1
Searching a
component
1. Fill in predefined
search strings into
search box
Verify that correct result
set is returned and
correctly rendered
Not yet
begun
125
WP202_T202.1_Test2
Browse
1. Select a random
component, note
down categories
listed
2. Navigate to
component again
via category
hierarchy
Component can be found
via a pathes
Not yet
begun
WP202_T202.1_Test3
Add
1. Take an existing
component.
2. Save page_1
3. Delete
component
4. Recreate
component based
on opened page_2
5. Save page_2
Confirm that page_1 and
page_2 are identical
Not yet
begun
WP202_T202.1_Test4
Edit
1. Take an existing
component_1
2. Open new tab
and Select
component_2
3. Edit
component_2 to be
equal to
component_1
Confirm that component_1
is equal to component_2
Not yet
begun
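The delete/recreate round-trip check of WP202_T202.1_Test3 can be expressed as a small script: render a component page, delete and recreate the component, and confirm both renderings match. The in-memory "repository" and render format below are purely illustrative, not the actual repository API:

```python
# Hypothetical sketch of the WP202_T202.1_Test3 round trip: page_1 and
# page_2 must be identical after delete + recreate. The dict-based
# repository and render format are placeholders for the real system.
repo = {}

def add(name, category, description):
    repo[name] = {"category": category, "description": description}

def delete(name):
    del repo[name]

def render(name):
    c = repo[name]
    return f"{name}: {c['category']} / {c['description']}"

add("screenreader", "AT", "reads screen content aloud")
page_1 = render("screenreader")
delete("screenreader")
add("screenreader", "AT", "reads screen content aloud")
page_2 = render("screenreader")
```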
T202.2 AT Specific I/O modules
AsTeRICS AT Modules
Test Number Assessment
Objective
(Annex I)
Indicator
(Annex I)
Test description Requirement Steps to execute Expected results Status
WP202_T202.2a_TEST_1 Pairing with HID
actuator
HID-actuator USB dongle
1 PC (controlled device)
1 PC with bluetooth
(controlling device)
1) Plug in the HID actuator
dongle at the remote
(target-) computer
2) Activate bluetooth
stack at local
(controller-)
computer
3) Add bluetooth
device (Search for
bluetooth devices) on
local computer
3a) Connect to 'Serial
Adapter'
3b) Enter pairing
code '1234'
1) HID actuator announces
itself as 'Serial Adapter', a
serial over bluetooth
adapter (SPP)
2) Pairing with serial adapter
is successful
3) new COM - port in Device
Manager appears (note
COM port number)
Not yet
begun
WP202_T202.2a_TEST_2 Moving mouse cursor of remote computer along a circular path
WP202_T202.2a_TEST_1 - AsTeRICS AT Modules - Universal HID Actuator (FHTW)
Java JRE (Java JDK for using the API in own applications)
demo application for HID actuator
1) Execute WP202_T202.2a_TEST_1 - AsTeRICS AT Modules - Universal HID Actuator (FHTW)
2) Start demo application for HID actuator (using HIDDemo.bat)
the mouse cursor of the remote (target) computer should start moving along a rectangular path
Not yet begun
Robobraille translator modules
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP202_T202.2 - AT
Specific I/O Modules
Open Source
version of
RoboBraille
Implementation platform should be available (5 high-end MS Windows computers, third party software including Microsoft Office, OCR solution with batch processing, text-to-speech engines, DAISY Pipeline, SaveAsDaisy, web server, ftp server, ...). Actual requirements will be documented in the implementation manual
1. Verify that it is possible to assemble a running system by following the implementation manual. In addition to the RoboBraille agent software, a number of commercial and open source technologies must be sourced.
2. Verify that all conversion paths in the RoboBraille conversion matrix (http://www.robobraille.org/sites/default/files/resourcefiles/robobraille_conversion_matrix_1-3.pdf) can be completed.
3. Verify that the system can be set up and operated as a stand-alone system behind a firewall.
(1) A running system can be assembled by following the guidelines in the implementation manual.
(2) All conversions in the RoboBraille conversion matrix can be completed.
(3) The solution will run in an open environment as well as a stand-alone environment behind a firewall.
Not yet begun
T202.3 – Generic multimodal interaction modules
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP202_T202.3a_TEST_1 Attaching Arduino
UNO and installing
device driver
Arduino UNO device driver
(COM-Port over USB)
1) Attach Arduino to
USB Port
2) Go to Device
Manager, select
Arduino device and
update driver with
provided driver file
1) After attaching Arduino
UNO, Arduino device with
unsupported driver must
appear in Device Manager
2) After driver installation a
new COM-Port must be added
to the Device Manager
Not yet
begun
WP202_T202.3a_TEST_2 Flashing FW and
establishing
communication
with
microcontroller
Successful test
WP202_T202.3a_TEST_1
flash.bat tool and firmware
- .hex file which holds the
communication firmware
for the Arduino controller
(ArduinoCIM.hex)
1) Perform WP202_T202.3a_TEST_1
2) Flash the firmware using the following procedure:
a) Press and hold the 'Reset' button
b) Start 'flash.bat COMxx' - where xx is the COM port number for the Arduino module which has been assigned in the Device Manager
c) Release the 'Reset' button
3) Unpower and re-power the Arduino UNO
Flashing should start, print out a progress bar and finish with the message 'successful'
Not yet begun
WP202_T202.3a_TEST_3
Receiving continuous
data stream from
microcontroller
within demo
application
Successful test
WP202_T202.3a_TEST_2
Java JRE (Java JDK for using
the API in own
applications)
demo application for
Arduino
1) Perform
WP202_T202.3a_TES
T_2
2) Start demo
application (using
ArduinoDemo.bat)
After startup of the demo
application continuous data
must be printed to the
console - this represents 6
ADC values of the Arduino
Analog/Digital Converter
Not yet
begun
WP202_T202.3b_TEST_1
Application should
properly integrate
the Haptic/Touch
I/O API developed
in T202.3, in order
to add haptic
feedback in various
user interface
components
1) Calibrate the haptic device using the corresponding software usually accompanying such a device
2) Programmatically integrate the Haptic/Touch I/O Module in the target application and call the available (by the module) functions where needed, in order to add haptic feedback to the preferred UI components
3) Run the target application and follow a simple scenario to interact with the user interface using a haptic device
The user should be able to feel the haptic feedback of various UI components through the haptic device
Not yet begun
WP202_T202.3c.TEST_1 Measure relative
movements of nose
from live video feed
using webcam
Computer with Webcam
OpenCV-Libraries
VideoInput-Library
Microsoft Visual Studio
DirectShow (Windows)
Java JRE (Java JDK for using
the API in own
applications)
demo application for
FaceTracker
1) Start demo
application (using
FaceTrackerDemo.bat
) - the demo adds
callback handler for
noseX and noseY
coordinates
2) After a video-
frame window
appears, move head
from left to right
1) As you move your head
from left to right the console
output should have a positive
value for noseX
2) As you move your head
from right to left the console
output should show a
negative value for noseX
3) As you move your head
from up to down the console
output should have a positive
value for noseY
4) As you move your head
from down to up the console
output should show a
negative value for noseY
Not yet begun
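The sign convention the FaceTracker demo's console output should follow can be sketched by turning successive nose coordinates into deltas (left-to-right head movement gives positive noseX, downward movement positive noseY). The real demo is a Java application with callback handlers; the helper below is a hypothetical stand-in used only to illustrate the expected signs:

```python
# Sketch of the expected sign convention for the FaceTracker output.
# The function and sample data are illustrative, not the demo's code.
def nose_deltas(positions):
    """Return (dx, dy) pairs between successive (noseX, noseY) samples."""
    return [
        (x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(positions, positions[1:])
    ]

# Head moving left-to-right, then downward:
samples = [(100, 50), (110, 50), (110, 60)]
deltas = nose_deltas(samples)
```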
T202.4 – Smart device and environment interconnection modules
Smart home integration modules
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP202_T202.4c_TEST_1 Test the operation
of a KNX-smart
home / building
automation
actuator
needed Hardware: KNX
components (minimum
setup is one KNX/IP router
and one KNX light
actuator and the ETS setup
software)
Java JRE (Java JDK for using
the API in own applications)
demo application for KNX
1) Attach the KNX light actuator device to the KNX bus and the KNX/IP router to the same bus and to a local area network where also the test computer is connected
2) Use the ETS4.0 software to assign suitable KNX group- and device addresses to your KNX light actuator
3) Start the demo application, use the IP address of the KNX/IP router and the assigned KNX address of the light actuator (which consists of group and device address separated by a # character) as parameters (KNXDemo.bat <ipadress> <KNXaddress>)
when everything has been correctly configured, the light should continuously turn on and off in an interval of 5 seconds
Not yet begun
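Parsing the `<KNXaddress>` argument — group and device address separated by a `#` character — can be sketched as follows. The concrete address formats in the example are placeholders; only the `#` separator is taken from the test description:

```python
# Sketch of parsing the <KNXaddress> parameter passed to KNXDemo.bat,
# where group and device address are separated by '#'. The example
# address strings are illustrative placeholders.
def parse_knx_address(arg):
    """Split 'group#device' into its two parts, validating the separator."""
    group, sep, device = arg.partition("#")
    if sep != "#" or not group or not device:
        raise ValueError(f"expected <group>#<device>, got {arg!r}")
    return group, device

group, device = parse_knx_address("1/2/10#1.1.4")
```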
WP202_T202.4c_TEST_2
Test the operation
of an EnOcean
smart home /
building automation
actuator
needed Hardware:EnOcean
(minimum setup is one
EnOcean USB adapter and
one EnOcean wireless
switch)
Java JRE (Java JDK for using
the API in own applications)
demo application for
EnOcean
1) Attach the
EnOcean bridge to a
USB port and install
the necessary
drivers (provided)
2) Configure the
EnOcean switch for
a desired address
3) start the demo
application, use the
address of the
EnOcean switch as a
parameter
(EnOceanDemo.bat
<switchAdress> )
when everything has been
correctly configured, pressing
the switch shall result in the
console output "switch has
been pressed"
Not yet
begun
WP202_T202.4c_TEST_3
Test the operation
of an FS20 smart
home / building
automation
actuator
needed Hardware: one FS20
USB dongle, one FS20 230V
actuator
Java JRE (Java JDK for using
the API in own applications)
demo application for FS20
1) Attach the FS20 dongle to a USB port
2) Plug the FS20 power plug actuator into a 230V wall outlet
3) press the button on the FS20 power plug actuator until the LED blinks
4) start the demo application (FS20Demo.bat) - this will automatically assign the FS20 address 1111111_1111 to the power plug actuator
when everything has been correctly configured, the LED of the FS20 plug should continuously turn on and off in an interval of 5 seconds - indicating that power is switched on / off in that interval
Not yet
begun
T202.5 – Real time user monitoring
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP202_T202.5b_TEST_1
Test the operation of a continuous reception of raw biosignal data from an OpenEEG ("P2"-protocol) compatible device
needed Hardware: OpenEEG compatible biosignal amplifier (e.g. ModularEEG or SMT-EEG by Olimex)
Java JRE (Java JDK for using the API in own applications)
demo application for OpenEEG P2 packet format data reception (P2Demo.bat)
1) attach an OpenEEG / P2 compatible biosignal acquisition device (e.g. the SMT EEG by Olimex) to a USB port
2) Install the provided device driver and verify that a COM port has been created in the Windows Device Manager - note the COM port number
3) start the demo application and use the COM port as an argument (P2Demo.bat <COMxx>)
a continuous flow of data (6 channel values) should appear on the console window (the values represent the averaged RMS values of the raw data values)
Not yet begun
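The "averaged RMS values of the raw data" that both biosignal demos print can be sketched as a reduction over a window of samples per channel. Window size and channel count below are illustrative, not taken from the demo applications:

```python
# Minimal sketch of the RMS reduction the biosignal demos print per
# channel. The sample data is illustrative only.
import math

def channel_rms(window):
    """Root-mean-square of one channel's raw sample window."""
    return math.sqrt(sum(v * v for v in window) / len(window))

# Two channels, four samples each:
raw = [[3, -3, 3, -3], [0, 0, 0, 0]]
rms = [channel_rms(ch) for ch in raw]
```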
WP202_T202.5b_TEST_2
Test the operation of a continuous reception of raw biosignal data from an OpenBCI compatible device
needed Hardware: OpenBCI biosignal amplifier
Java JRE (Java JDK for using the API in own applications)
demo application for OpenBCI data stream reception (OpenBCIDemo.bat)
1) attach an OpenBCI bluetooth dongle to a USB port and power up the OpenBCI board
2) Install the provided device driver and verify that a COM port has been created in the Windows Device Manager - note the COM port number
3) start the demo application and use the COM port as an argument (OpenBCIDemo.bat <COMxx>)
a continuous flow of data (8 channel values) should appear on the console window (the values represent the averaged RMS values of the raw data values)
Not yet begun
T202.6 – Web based smart personalization and interface adaptation modules
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP202_T202.6
TEST_1
Validate
automated test
coverage
Run all existing
automated (i.e.
unit and
acceptance) tests
in the latest
version of the
following
browsers: Chrome,
Firefox, Safari, and
Internet Explorer
on Windows and
Mac OS X
All tests should pass, unless
otherwise documented.
Not yet
begun
WP202_T202.6
TEST_2
Extend automated test coverage with new or refined UI automation tests
Based on analysis of new features implemented this year, identify and implement automated UI acceptance tests (using a technology such as PhantomJS)
Test suite coverage will be expanded, reducing the time and effort required to do successive technical validation tasks.
Not yet begun
WP203 – Collaborative development tools/environments
T203.1 – Development tools for adaptive interfaces for mainstream applications
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP203_T203.1_TEST_1 Developer tool
should create a
web app/site with
adaptive
components
compatible with the
runtime
environment based
on the model the
user created
Following steps have to be performed by the user:
1) Open the application
2) create a new project
3) create states
4) connect states via transitions
5) define interaction possibilities for each state
6) define data acquisition functions for the states
7) deploy and start the adaptive user interface in the runtime environment
adaptive web app/site
Not yet begun
WP203_T203.1_TEST_2 Adding pattern set
Following steps
have to be
performed by the
user:
1) Add new patterns
in the defined
manner
2) Refresh
development
environment
3) View list of
patterns available
for modeling
The new patterns are
available for modeling
Not yet
begun
T203.2 – AT Configuration environment
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP203_T203.2_TEST_01 WebACS creates
TCP/IP connection
to ARE
User actions: Click on
the "Connect to ARE"
button in the Ribbon
menu "System".
The ACS establishes a TCP/IP connection to the ARE and data can be exchanged between the two software components and the connection status changes to "Connected". If there is no ARE reachable in the network a meaningful error message should be displayed. If the connection status is "Connected" this button must be greyed out.
Not yet begun
WP203_T203.2_TEST_02
Disconnect from
ARE
User actions: Click
on the "Disconnect
from ARE" button in
the Ribbon menu
"System"
The ACS closes the TCP/IP
connection to the ARE. The
connection status changes to
Disconnected. If the
connection status is
"Disconnected" this button
must be greyed out.
Not yet
begun
WP203_T203.2_TEST_03
Upload model
User actions: Click
on the "Upload
model" button in the
Ribbon menu
"System"
The ACS sends the model as
an xml string to the ARE over
the previously established
TCP/IP connection. If the
connection Status is
"Disconnected" this button is
greyed out.
Not yet
begun
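The ACS sends the model to the ARE as an XML string. A minimal sketch of that serialization with the standard library follows; the element and attribute names here are illustrative placeholders, not the ACS model schema:

```python
# Hypothetical sketch of serializing a model description to an XML
# string, as the ACS does when uploading a model to the ARE. Element
# and attribute names are NOT the real ACS schema.
import xml.etree.ElementTree as ET

def model_to_xml(name, component_ids):
    """Build an XML string for a model with the given component ids."""
    root = ET.Element("model", {"name": name})
    for cid in component_ids:
        ET.SubElement(root, "component", {"id": cid})
    return ET.tostring(root, encoding="unicode")

xml_string = model_to_xml("demo-model", ["sensor1", "actuator1"])
```

On the receiving side, the same string can be parsed back with `ET.fromstring` and validated against the schema before display.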
WP203_T203.2_TEST_04
Download model
User actions: Click on
the "Download
Model" button in the
Ribbon menu
"System".
The ACS downloads the model from the ARE as an XML string and displays the model in the Model Designer canvas. If the model contains errors (does not match the xml schema) a meaningful error message is shown. If the connection status is "Disconnected" this button must be greyed out.
Not yet begun
WP203_T203.2_TEST_05
Download
component
collection
User actions: Click on
the "Download
component
collection" button in
the Ribbon menu
"System"
The ACS downloads all
available components/plugins
in xml format from the
connected ARE and updates
the drop down lists for
sensors, actuators and
processors. If the received
data does not match the
components xml schema a
meaningful error message is
shown. This button must be
greyed out when the
connection status is
"Disconnected"
Not yet
begun
WP203_T203.2_TEST_06
Set model as
autorun
User actions: Click on
the "Set model as
autorun" button in
the Ribbon menu
"System"
The ACS transmits the model visible in the "Model Designer" canvas to the ARE in xml file format. The ARE stores this model in the filesystem under "ARE/models/autostart.acs". The next time the ARE gets started this model starts automatically. If during the sending process the connection is interrupted, a meaningful error message is shown. If the connection status is "Disconnected" this button is greyed out.
Not yet begun
WP203_T203.2_TEST_07
Start model
User action: Click on
the "Start model"
button in the Ribbon
menu "System"
The ACS transmits the "start
model" command to the ARE.
The model should start in the
ARE. This button must be
greyed out when the
connection status is
"Disconnected" or no model
has been uploaded to the ARE
yet.
Not yet
begun
WP203_T203.2_TEST_08
Pause model
User action: Click on
the "Pause model"
button in the Ribbon
menu "System"
The ACS transmits the "pause
model" command to the ARE.
The model should pause in
the ARE. This button must be
greyed out when the
connection status is
"Disconnected" or no model
has been started on the
ARE yet.
Not yet
begun
WP203_T203.2_TEST_09
Stop model
User action: Click on
the "Stop model"
button in the Ribbon
menu "System"
The ACS transmits the "stop
model" command to the ARE.
The model should stop in the
ARE. This button must be
greyed out when the
connection status is
"Disconnected" or no model
has been started or paused
on the ARE yet.
Not yet
begun
WP203_T203.2_TEST_10
Create new model
User actions: Click on
the "New model"
button in the Ribbon
menu "System"
If the "Model Designer"
contains unsaved changes a
dialog pops up asking the user
if he wants to store the
modified model. If the user
answers the dialog with yes,
the model gets overridden if it
was already stored before in
the filesystem. Otherwise a
new file chooser dialog pops
up where the user has to
select the target file path.
Then the model gets stored in
the chosen file path. After
that, or when the user
presses "No" in the "Store
modifications" dialog, the
canvas is cleared and a new
model gets initialized in
memory. This functionality is
disabled if the connection
status of the ACS is
"Connected" and the model
that is loaded at the moment
is running on the ARE.
Not yet begun
WP203_T203.2_TEST_11
Open model
User actions: Click on
the "Open model"
button in the Ribbon
menu "System"
If the "Model Designer" contains unsaved changes a dialog pops up asking the user if he wants to store the modified model. If the user answers the dialog with yes, the model gets overridden if it was already stored before in the filesystem. Otherwise a new file chooser dialog pops up where the user has to select the target file path. Then the model gets stored in the chosen file path. After that, another file chooser dialog pops up where the user can select the model which should be loaded. After the user selects a model file the "Model Designer" canvas is cleared, and the selected model gets loaded into memory. If the model to open does not conform to the ACS model xml schema a meaningful error message gets displayed. This functionality is disabled if the connection status of the ACS is "Connected" and when the model that is visible in the "Model Designer" canvas is running on the ARE at the moment.
Not yet begun
WP203_T203.2_TEST_12
Save model
User actions: Two
possibilities to
trigger this
functionality. Click
on the "Save model"
button in the Ribbon
menu "System" or
press Ctrl-S
If the model displayed in the
"Model Designer" has been
previously stored in a file, this
file will be overridden.
Otherwise a file chooser
dialog pops up, where the
user can choose the target file
to store the model. The file
chooser dialog can also be
canceled, which also cancels
the whole save operation.
Not yet begun
WP203_T203.2_TEST_13
Save model as
User actions: Click on
the "Save model as"
button in the Ribbon
menu "System"
A file chooser dialog pops up
where the user can select the
target file to store the model
displayed in the "Model
Designer". When the user
selects a target file, the xml
representation of the model
data gets stored to the
selected location. The user
also has the possibility to
cancel the dialog which means
that the model will not be
stored to a file.
Not yet
begun
WP203_T203.2_TEST_14
Edit model
description
User actions: Click on
the "Edit model
Description" button
in the Ribbon menu
"Edit".
A dialog pops up containing an editable textarea where the user can modify the model description. If the model already contains a description, its value is shown in the textarea when the dialog appears.
Not yet begun
WP203_T203.2_TEST_15
Move component
User actions: Two
possibilities: Drag a
component with the
mouse or select it via
the keyboard and
move it around with
the movement
keyboard keys e.g.
cursor keys.
The selected component
(keyboard focus/mouse click)
moves around following
the mouse movements/key
presses. All connected
elements like eventchannels
and datachannels also update
their position.
Not yet
begun
WP203_T203.2_TEST_16
Copy component
User actions: Two
possibilities: Select
one or more
components/channels/
event channels
with the mouse or
the keyboard. Then
press Ctrl-C or use
the button "Copy" in
the Ribbon menu
"Edit"
The selected
components/data
channels/event channels are
stored in the local "copy
buffer" and can be pasted
later on. Previously stored
data in the "copy buffer" gets
overridden.
Not yet
begun
WP203_T203.2_TEST_17
Cut component
User actions: Two possibilities: Select one or more components/channels/event channels with the mouse or the keyboard. Then press Ctrl-X or use the button "Cut" in the Ribbon menu "Edit"
The selected components/data channels/event channels are deleted from the displayed model in the "Model Designer" canvas, get stored in the local "copy buffer" and can be pasted later on. Previously stored data in the "copy buffer" gets overridden.
Not yet begun
WP203_T203.2_TEST_18
Paste component
User actions: Two
possibilities: Press
Ctrl-v or use the
button "Paste" in the
Ribbon menu "Edit"
The elements previously
stored in the "copy buffer" by
a copy or cut action are
pasted into the model visible
in the "Model designer"
canvas. If a pasted element
has the same id as an existing
component in the displayed
model, it gets refactored to be
unique within the new model.
This functionality is disabled if
the copy buffer is empty.
Not yet
begun
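TEST_18 requires a pasted element whose id clashes with an existing component to be refactored to a unique id. One way to do that is sketched below; the suffix-numbering scheme is an assumption for illustration, not the documented ACS behaviour:

```python
# Hypothetical sketch of id refactoring on paste: if the wanted id is
# taken, append the first free numeric suffix. The suffix scheme is an
# assumption, not the ACS implementation.
def unique_id(wanted, existing):
    """Return wanted if free, otherwise wanted.2, wanted.3, ..."""
    if wanted not in existing:
        return wanted
    n = 2
    while f"{wanted}.{n}" in existing:
        n += 1
    return f"{wanted}.{n}"

model_ids = {"button1", "button1.2"}
new_id = unique_id("button1", model_ids)
```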
WP203_T203.2_TEST_19
Delete selected
components
User actions: Two
possibilities: Select
one or more
components/channels/
event channels
with the mouse or
the keyboard. Then
press the "Del" key
or use the button
"Delete" in the
Ribbon menu "Edit"
The selected elements get
deleted from the model and
are no longer visible in the
"Model designer".
Not yet
begun
WP203_T203.2_TEST_20
Undo
User actions: Two
possibilities: Press
Ctrl-z or use the
"Undo" button in the
Ribbon Menu
"System"
The last action like move
element, delete element, add
element, change property of
element gets reverted in the
displayed model of the
"Model Designer" canvas. The
action object gets popped
from the undo stack and the
undo-action object gets
pushed on the redo stack.
Not yet
begun
WP203_T203.2_TEST_21
Redo
User actions: Two
possibilities: Press
Ctrl-y or use the
"Redo" button in the
Ribbon Menu
"System"
The action which got
previously reverted by the
undo function is again applied
on the displayed model in the
"Model Designer" canvas. The
applied action object gets
popped from the redo stack
and pushed on the undo
stack. This functionality is
deactivated when no undo
operation was performed by
the user while editing the
displayed model in the
"Model Designer" canvas
Not yet
begun
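TEST_20 and TEST_21 describe undo/redo as two stacks of action objects: undo pops an action onto the redo stack, redo moves it back. A compact sketch of that mechanism (the string actions are illustrative stand-ins for the ACS action objects):

```python
# Sketch of the two-stack undo/redo mechanism described in TEST_20/21.
# Action objects here are plain strings for illustration.
class History:
    def __init__(self):
        self.undo_stack, self.redo_stack = [], []

    def do(self, action):
        self.undo_stack.append(action)
        self.redo_stack.clear()      # a fresh action invalidates redo

    def undo(self):
        action = self.undo_stack.pop()
        self.redo_stack.append(action)
        return action

    def redo(self):
        action = self.redo_stack.pop()
        self.undo_stack.append(action)
        return action

h = History()
h.do("move component")
h.do("delete component")
undone = h.undo()   # "delete component" moves to the redo stack
redone = h.redo()   # ... and back to the undo stack
```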
T203.3 – Runtime environment
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP203_T203.3_TEST_1
Connect to the ARE
via the REST interface
Proceed with the
following actions:
1) Open the Web-ACS
application
2) Click Connect to ARE
Application displays that it is
connected to ARE.
Not yet
begun
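The Web-ACS talks to the ARE over a REST interface, but this deliverable does not specify the endpoint paths. The sketch below only illustrates how such calls might be shaped; the base URL, port, and route names are purely hypothetical placeholders:

```python
# Purely hypothetical sketch of building ARE REST endpoint URLs.
# Host, port, and route names are assumptions, not the ARE's real API.
BASE = "http://localhost:8081/rest"   # assumed host/port

def are_url(resource, action=None):
    """Join the base URL, resource, and optional action into one path."""
    parts = [BASE, resource] + ([action] if action else [])
    return "/".join(parts)

deploy = are_url("model", "deploy")
```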
WP203_T203.3_TEST_2
Load a model from
the repository.
Having opened the
Web-ACS proceed with
the following actions:
1) Click Load model
from Repository
2) Click OK
Model is loaded to the canvas of
the Web-ACS
Not yet
begun
WP203_T203.3_TEST_3
Deploy a model to the
ARE.
Having loaded the
model proceed with the
following actions:
1) Click Deploy model
Model is deployed and the start
button on Web-ACS is enabled.
Not yet
begun
WP203_T203.3_TEST_4
Start the model
deployed on the ARE.
Having deployed the
model proceed with the
following actions:
1) Click Start model
Model is started and the
application is actively running.
Not yet
begun
WP203_T203.3_TEST_5
Pause the model
deployed on the ARE.
Having started the
model proceed with the
following actions:
1) Click Pause model
Model is paused and the
application execution is paused.
Not yet
begun
WP203_T203.3_TEST_6
Resume the model
deployed on the ARE.
Having paused the
model proceed with the
following actions:
1) Click Start model
Model is re-started and the
application is actively running.
Not yet
begun
WP203_T203.3_TEST_7
Stop the model
deployed on the ARE.
Having started the
model proceed with the
following actions:
1) Click Stop model
Model is stopped and the
application execution stops.
Not yet
begun
WP203_T203.3_TEST_8
Save a model created
using the Web-ACS.
Proceed with the
following actions:
1) Open the Web-ACS
application
2) Create the model
3) Click Save model
Model is saved in the repository.
Not yet
begun
T203.4 – Guidelines and frameworks for low cognitive and stepping stone applications for low digital literacy
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP203_D203.1_TEST_1
Functionality
Reliability
Usability
Suitability
Accuracy
Maturity
Fault tolerance
Understandability
Operability
It is possible to create a predefined app using a selection of predefined components connected in a specified way. The app functions correctly.
Follow detailed instructions
The app functions correctly.
Pass
WP203_D203.1_TEST_2
Specific features
designed for coga11y
function correctly
Not yet
begun
WP203_D203.1_TEST_3
Functionality
Reliability
Usability
Suitability
Accuracy
Maturity
Fault
tolerance
Understandability
Operability
Detailed tests of
each feature &
capability listed
Follow detailed
instructions
The app functions correctly.
PASS
WP204 – Media and material automated/crowdsourced transformation infrastructures
T204.1 – T204.2 – T204.5 – T204.6 – Extended crowd-corrected captioning platform
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP204_T204.1_modularization_of_code__documentation_and_development
- Conversion, modularization, and documentation of PCF's Amara media transformation tools
- Development of a Media Transformation Infrastructure that can support crowdsourced services and maintenance improvements. Facilitation of open development of new or better modules for the Materials Transformer Infrastructure/Tool Kit.
1) Visit the amara code repository at: https://github.com/pculture/unisubs
2) Inspect (or follow the instructions in) the README for getting a development environment up and running.
3) Click the Issue button and view the list of issues in the issue tracker.
1) Amara's code repository is open and accessible.
2) README contains instructions for setting up a development environment to contribute to the project.
3) List of issues contains clear tasks that need to be performed in order to improve the amara codebase. Beginner issues are tagged as bite-sized to indicate they are simpler tasks good for a new contributor.
Not yet begun
WP204_T204.1_syncing_with_additional_platforms__inspection
Build compatibility to sync with additional media hosting platforms
1) View the Amara API documentation at http://universal-subtitles.readthedocs.org/en/latest/api.html
The use of the amara API is documented and accessible for hosting platform integration by pushing videos and pulling completed subtitles.
Not yet begun
WP204_T204.1_syncing_with_additional_platforms__execution
Build compatibility to sync with additional media hosting platforms
1) Visit the P4All test team on amara.org and check the tab for Settings > Integrations
2) Choose a youtube video on the team, add subtitles and mark as complete.
3) Choose a brightcove video on the team, add subtitles and mark as complete
4) Choose a Kaltura video on the team, add subtitles and mark as complete
1) Team is configured to sync subtitles to Youtube, Brightcove and Kaltura
2) Completed youtube subtitles are synced to the youtube video.
3) Completed brightcove subtitles are synced to the brightcove video
4) Completed Kaltura subtitles are synced to the Kaltura video
Not yet begun
WP204_T204.1_editor_compatibility
Build compatibility with Amara editor for additional media hosting platforms and media types
1) Add 1 video of each of the following types to amara: Brightcove, mp3, ogg, vimeo, youtube, mp4
2) On the video page of each video, click to create subtitles
1) Videos can be added to the amara platform
2) Videos are opened in the editor, playback and subtitles can be typed, synced and saved.
Not yet begun
WP204_T204.1_display_widget_media_types
Build compatibility with Amara display widget for additional media hosting platforms and media types
1) For each of the following video types: Brightcove, ogg, mp3, vimeo, youtube, wistia, follow the instructions (https://github.com/pculture/unisubs/wiki/Embed-Code-Usage-Guide) to create an html page with the video embedded.
1) Video types can be displayed in the offsite embedder. Subtitles and a transcript can be displayed in each embed.
Not yet begun
WP204_T204.2_simple_captioning__collaboration
Modification of subtitle editor to support real-time collaboration (and near real-time captions)
1) Open the editor and add a few subtitles to the video, add a note, click Save draft and exit.
2) As a second user, open the editor for the same video language.
3) In the editor session, add a new note, and save a subtitle draft.
4) As the original editor, open the same video / language
1) User can edit subtitles and add notes.
2) Second user can edit same subtitles and view notes left by original subtitler
3) Second subtitler can add notes, edit and save subtitle versions.
4) Original subtitler sees notes and changes made by previous subtitler.
Not yet begun
WP204_D204.2_simple_captioning__editor_locking
Addition of user presence to subtitle editor and collision avoidance, in order to furnish low-friction collaboration
1) Open the editor for a video / language and leave it open.
2) In a second browser, log in as a different user and attempt to open the same video / language for editing.
1) First user can edit the subtitles.
2) Second user is displayed a message that the subtitles are currently locked.
Not yet begun
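The collision-avoidance behaviour tested here — the first editor of a video/language takes a lock, a second user is told the subtitles are locked — can be sketched as a small lock table. Data structures and message text below are illustrative, not Amara's code:

```python
# Illustrative sketch of editor locking per (video, language) pair.
# The lock table and messages are hypothetical, not Amara's model.
locks = {}

def open_editor(video, language, user):
    """Grant editing to the first user; report the lock to later users."""
    key = (video, language)
    holder = locks.setdefault(key, user)
    if holder == user:
        return "editing"
    return f"subtitles are currently locked by {holder}"

first = open_editor("v1", "en", "alice")
second = open_editor("v1", "en", "bob")
```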
WP204_D204.2_simple_captioning__revision_comparisons
Creation of more distinguished user presence in history/diff comparison system
1) Open the video language of a video in amara that has a few revisions.
2) Verify the information displayed on the revision list.
1) Video language page has a revision tab that shows all revisions.
2) Revisions list contains the revision number, date, source and user. User is a link to the User's Profile page on amara.
Not yet begun
WP204_D204.2_simple_captioning__reference_language_display
Modification of caption timing editor to compare two differently timed subtitle languages side-by-side, while in edit mode
1) Open the video page of a video in amara that has subtitles in a few different languages.
2) Click Add a new language to start subtitling in a different language.
3) Verify the amara editor displays the reference language to the left of the source languages.
1) List of languages is displayed on the video page.
2) Language is selected and the amara editor opens.
3) Reference languages are displayed to the left of the working language. User can lock / unlock the scrolling to modify alignment of the subtitle lines. Subtitle start times for each line are displayed.
Not yet begun
WP204_D204.2_simple_captioning__reference_language_switching
Modification of data model to support dynamic many-to-many relationships between subtitle languages
1) Open the video page of a video in amara that has subtitles in a few different languages.
2) Start playback
3) Click Add a new language to start subtitling in a different language.
4) Choose a new language / version from the reference language menu to the left of the working language.
1) List of languages is displayed on the video page.
2) Language is selected and the amara editor opens.
3) Reference languages are selectable in the interface. User can choose from any available language. User can choose to copy timings from the reference language as a source for the working language.
Not yet begun
WP204_T204.3_reponsiv
e_editor_ui
Interfaces to allow
captioning by users
and communities
that primarily use
phones and tablets.
1) Using a mobile device or simulator with a minimum resolution of 1080p, open the amara editor.
2) Start playback.
3) Type subtitles.
4) Sync timings.
5) Submit subtitles.
1) Amara editor displays correctly at this resolution.
2) Buttons on the UI play / pause the video.
3) User can type subtitles.
4) User can sync timing using the UI buttons.
5) User can save or submit
subtitles.
Not yet
begun
WP204_T204.3_reponsiv
e_submit_videos
Interfaces to allow
captioning by users
and communities
1) Using a mobile
device or simulator
with a minimum
resolution of 1080p,
1) User can submit a video to
amara using a mobile device.
Not yet
begun
that primarily use
phones and tablets.
navigate to the
subtitle video page
and add a video url.
WP204_T204.3_reponsive_view_captioned_videos
Interfaces to allow
captioning by users
and communities
that primarily use
phones and tablets.
1) Using a mobile
device or simulator
with a minimum
resolution of 1080p,
navigate to the
subtitle video page
and add a video url.
1) User can submit a video to
amara using a mobile device.
Not yet
begun
WP204_T204.3_reponsiv
e_submit_videos
Interfaces to allow
captioning by users
and communities
that primarily use
phones and tablets.
1) Using a mobile device or simulator with a minimum resolution of 1080p, open amara and navigate to a video page.
1) User can view videos on amara
Not yet
begun
WP204_T204.3_watch_v
ideo_with_captions
Interfaces to allow
captioning by users
and communities
that primarily use
phones and tablets.
1) Using a mobile
device or simulator
with a minimum
resolution of 1080p,
open a video page,
choose a language in
the embedded player
and start playback.
1) User can watch a video
with captions on a mobile
device
Not yet
begun
WP204_T204.3_view_ca
ptioned_revisions
Interfaces to allow
captioning by users
and communities
1) Using a mobile
device or simulator
with a minimum
resolution of 1080p,
1) User can view captioned revisions on a mobile device
Not yet
begun
that primarily use
phones and tablets.
open a video
language page.
WP204_T204.3_watch_v
ideo_view_transcript
Interfaces to allow
captioning by users
and communities
that primarily use
phones and tablets.
1) Using a mobile
device or simulator
with a minimum
resolution of 1080p,
open a video page
and click the
Transcript button on
the embedded video
1) User can view and search through the content of the transcript.
Not yet
begun
WP_204_T204.5_media_
enhanced_documents__
video
Access to Media
Enhanced
Documents
1) Sensus can check
for existing captions
on amara for videos
or audio urls
embedded in
media_enhanced_do
cuments
1) Sensus has an api
integration with amara to
retrieve captions for audio
and video urls.
Not yet
begun
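The Sensus/amara integration above boils down to asking amara whether captions already exist for a given video or audio URL. The following sketch only parses a sample response; the JSON shape and field names (`languages`, `subtitles_complete`) are assumptions for illustration, not amara's documented API.

```python
import json

# Hypothetical shape of an amara caption-lookup response; the field names
# are assumptions for illustration, not amara's documented API.
SAMPLE_RESPONSE = json.dumps({
    "video_url": "http://example.com/talk.mp4",
    "languages": [
        {"code": "en", "subtitles_complete": True},
        {"code": "de", "subtitles_complete": False},
    ],
})

def captioned_languages(response_text: str) -> list:
    """Return language codes that already have complete captions."""
    data = json.loads(response_text)
    return [lang["code"] for lang in data["languages"] if lang["subtitles_complete"]]

print(captioned_languages(SAMPLE_RESPONSE))  # ['en']
```

Sensus would call such a lookup for each media URL embedded in a media-enhanced document and only offer captioning where a complete track exists.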
WP205 – Assistance on Demand Services Infrastructures
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP205_D205.3_TEST_1
Service providers/end-users can connect to people with similar interests and make new friends
1. A service provider/end-user uses the search functionality in order to find other providers/end-users with specific interests
2. He/she sends a friend request to a selected provider/end-user
3. The targeted provider/end-user is notified that there is a new friend request for him/her
4. The targeted provider/end-user accepts the friend request
A new friend relationship is established and from now on the two friends can chat, send messages and/or subscribe to each other's news feed
PASS
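The friend-request flow can be sketched as a small state machine: a pending request, a notification to the target, and a symmetric friendship once accepted. The class and method names below are illustrative, not the platform's actual code.

```python
class FriendRequests:
    """Minimal sketch of the friend-request flow (names are illustrative)."""

    def __init__(self):
        self.pending = {}        # target -> set of requesters
        self.friends = set()     # frozensets, so friendship is symmetric
        self.notifications = []  # (user, message) pairs shown in the profile

    def send_request(self, sender, target):
        self.pending.setdefault(target, set()).add(sender)
        # The targeted user is notified that there is a new friend request.
        self.notifications.append((target, f"friend request from {sender}"))

    def accept(self, target, sender):
        self.pending.get(target, set()).discard(sender)
        # A new friend relationship is established.
        self.friends.add(frozenset((sender, target)))

    def are_friends(self, a, b):
        return frozenset((a, b)) in self.friends
```

After `accept`, the pair can chat, exchange messages and subscribe to each other's news feed, which is exactly what the test's expected result describes.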
WP205_D205.3_TEST_2
Service providers/end-
users can chat with their
friends
1. A service provider/end-user
selects a friend from his/her
friend list and clicks on the
corresponding button to start
chatting
2. After the chat window is
opened, he/she writes a phrase
and presses ENTER
The targeted friend is
immediatelly notified in
his/her profile that the
specific user tries to start
chatting with him/her.
The phrase sent by the
friend is shown and the
current user is able to
reply directly.
PASS
WP205_D205.3_TEST_3
Service providers/end-
users can be automatically
informed when an event
related to their interests
happens
1. A service provider offering service A defines in his/her profile that he/she wants to be notified when an end-user located in the same city has defined that he/she is interested in service A
2. A corresponding end-user defines in his/her profile that he/she is interested in finding a provider offering service A
The service provider is informed that the corresponding end-user is interested in his/her services
PASS
1. An end-user interested in service A defines in his/her profile that he/she wants to be notified when a provider located in the same city has defined that he/she offers service A
2. A corresponding service provider defines in his/her profile that he/she offers service A
The end-user is informed that the corresponding provider offers the desired services
PASS
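Both directions of the test reduce to the same matching rule: a provider and an end-user match when they are in the same city and the offered service equals the wanted one, and each side gets a notification. The profile fields below are illustrative, not the platform's actual data model.

```python
# Sketch of the interest-matching notifications; profile fields are
# illustrative assumptions, not the platform's actual data model.
def match_notifications(providers, end_users):
    """Notify providers of interested end-users in the same city, and
    end-users of matching providers, as in WP205_D205.3_TEST_3."""
    notes = []
    for p in providers:
        for u in end_users:
            if p["city"] == u["city"] and p["service"] == u["wants"]:
                notes.append((p["name"], f"{u['name']} is interested in {p['service']}"))
                notes.append((u["name"], f"{p['name']} offers {u['wants']}"))
    return notes
```

A provider in Athens offering service A and an Athens end-user wanting service A thus produce exactly two notifications, one per side, matching the two PASS rows of the test.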
WP205_D205.3_TEST_4
Service providers/end-
users can participate to
various user groups of their
interest or even create new
user groups and invite
friends to join, in order to
support crowd-based
services
1. A service provider/end-user
searches for a user group of
his/her interest
2. Then, he/she clicks on the
corresponding button in order
to join the group
The service
provider/end-user is
then subscribed to user
group's news feed and
he/she is also able to
make new posts to the
group
PASS
1. A service provider/end-user
clicks on the corresponding
button in order to create a new
user group
2. He/she puts a name and a description for this user group and also defines if it will be private (members only by invitation) or public (visible to all and anyone can join)
The new user group is
created and is ready to
accept new members
PASS
WP205_D205.3_TEST_5
Service providers/end-
users can organize events
and meetings
1. A service provider/end-user clicks on the corresponding button in order to create a new event
2. He/she selects the type of the event (teleconference, face-to-face meeting, etc.)
3. He/she puts a description and a location (if applicable) for the event
4. He/she defines that it will be private (participation only by invitation)
5. He/she selects a set of his/her friends or members of a specific user group to send an invitation for this event
6. He/she completes the event creation by clicking on the corresponding button
The new event is created and all the invited users are notified in their profiles for this event and for the corresponding invitation
PASS
1. A service provider/end-user
clicks on the corresponding
button in order to create a new
event
2. He/she selects the type of
the event (teleconference,
face-to-face meeting, etc.)
3. He/she puts a description
and a location (if applicable) for
the event
4. He/she defines if it will be
public (visible to all and anyone
can participate)
5. He/she completes the event creation by clicking on the corresponding button
The new event is created, it is visible to all and anyone can participate
PASS
WP206 - Sustainable Meaningful Consumers-Developer Connections (Pull vs Push)
T206.2: Creation of feedforward mechanisms for directing future development efforts
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP206_D206.2_TEST_01
Users can register
in the application
using a custom
accessing system
Users are not registered
in the application and
then:
1) Open the application
2) Go to “New user”
3) Introduce a new
username and a
password.
4) Click on “Create new
user”
5) If the username is not
used, the new user is
created
The new user is
created
Not yet
begun
WP206_D206.2_TEST_02
Users can log in
the application
using a custom
accessing system
Users are not logged in the application, but they have a user account, and then:
1) Open the application
2) Go to “Log in”
3) Introduce a valid
username and password.
4) Click on “Log in”
The user logs in the
application
Not yet
begun
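Tests 01 and 02 exercise the custom accessing system: create a user only if the username is free, then log in with valid credentials. The sketch below is illustrative only; a real deployment would use salted password hashing and persistent storage, and the class and method names are assumptions for the example.

```python
import hashlib

class UserStore:
    """Sketch of the custom accessing system in WP206_D206.2_TEST_01/02.
    Illustrative only: real systems should use salted password hashing."""

    def __init__(self):
        self._users = {}

    def create_user(self, username, password):
        # "Create new user" succeeds only if the username is not already used.
        if username in self._users:
            return False
        self._users[username] = hashlib.sha256(password.encode()).hexdigest()
        return True

    def log_in(self, username, password):
        # Log in succeeds only with a valid username/password pair.
        digest = hashlib.sha256(password.encode()).hexdigest()
        return self._users.get(username) == digest
```

The two expected results map directly onto the return values: `create_user` is true for a fresh username, and `log_in` is true only for a registered user with the right password.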
WP206_D206.2_TEST_03
Users can log in to the application using a Facebook account
Users are not logged in
the application and then:
1) Open the application
2) Go to “Log in using
The user logs in the
application using
the Facebook
account
Not yet
begun
Facebook”
3) Introduce a valid
Facebook username and
password.
4) Click on “Log in”
WP206_D206.2_TEST_04
Users can log in to the application using a Google+ account
Users are not logged in
the application and then:
1) Open the application
2) Go to “Log in using
Google+”
3) Introduce a valid
Google+ username and
password.
4) Click on “Log in”
The user logs in the
application using
the Google+ account
Not yet
begun
WP206_D206.2_TEST_05
Change language
Users log in the
application and then:
1) Click on a “Spanish”
link at the top of the
application.
2) Click on an “English” link at the top of the application.
The language of the
application interface
changes to the
selected language
Not yet
begun
WP206_D206.2_TEST_06
Users can create
new product
proposals. Both
End Users and
Users log in the
application and then:
1) Click on “New product
proposal”
A new product is
created. The
product must be
approved by a
Not yet
begun
Developers can
create them
2) Introduce the data of
the new proposed
product using a web
form.
3) Optionally, create one
or more characteristics of
the product, using
another web form
4) Click on “Create
product”
Developer before
appearing at the
public list of
products
WP206_D206.2_TEST_07
Users can add new
proposals to
existing products
Users log in the
application and then:
1) Click on “List of
proposed products”
2) Browse the list.
3) Click on the name of a
product to see a full
description of the
product.
4) Click on the “New
attached proposal”
button.
5) Fill the form with the
information about the
new proposal
The new proposal
must be approved
by a Developer.
Then, it will appear
as a new
characteristic of the
product
Not yet
begun
WP206_D206.2_TEST_08
Users can add a
vote to a product
or product’s
characteristic
Users log in the
application and then:
1) Click on “List of
proposed products”
2) Browse the list.
3) Users can vote clicking
on a “Vote” button next
to a product.
4) Optionally, users can
click on the name of a
product to see a full
description of the
product.
a. On the description
page, users can vote for
the product, clicking on a
“Vote” button.
b. Products can have a list
of characteristics. Next to
each characteristic there
is a “Vote Characteristic”
button, so users can vote
them also
As users vote for products and product characteristics, the number of votes is shown on the lists
Not yet
begun
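The voting behaviour in TEST_08 can be sketched as a tally keyed by product and optional characteristic, so that both product-level and characteristic-level votes accumulate independently. This is an illustrative sketch; the product name used below is hypothetical.

```python
from collections import Counter

# Votes are keyed by (product, characteristic); characteristic None means
# the vote targets the product itself. Product names here are hypothetical.
votes = Counter()

def vote(product, characteristic=None):
    """Record one vote for a product or one of its characteristics."""
    votes[(product, characteristic)] += 1

# Two votes for the product, one for a specific characteristic.
vote("ScreenReaderX")
vote("ScreenReaderX")
vote("ScreenReaderX", characteristic="voice speed")
```

The lists then simply display `votes[(product, None)]` next to each product and `votes[(product, c)]` next to each characteristic, which is the expected result of the test.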
WP206_D206.2_TEST_09
Users can add
comments to
products or
Users log in the
application and then:
1) Click on “List of
proposed products”
The comment must
be approved by a
Developer before
being published.
Not yet
begun
product’s
characteristics
2) Browse the list.
3) Users can add a
comment clicking on a
“Comment” button next
to a product.
4) Optionally, users can
click on the name of a
product to see a full
description of the
product.
a. On the description
page, users can add a
comment to a product by
clicking on a “Comment”
button.
b. Products can have a list
of characteristics. Next to
each characteristic there
is a “Comment
Characteristic” button, so
users can comment them
also.
Then, any user can
read it.
WP206_D206.2_TEST_10
Share a product or
product’s
characteristic on a
social network
Users log in the
application and then:
1) Click on “List of
proposed products”
2) Browse the list.
Users must be logged in to a Facebook/Google+
account to share a
product or product’s
Not yet
begun
3) Users can share a
product clicking on the
“Share with [Facebook |
Google+]” buttons next to
a product.
4) Optionally, users can
click on the name of a
product to see a full
description of the
product.
a. On the description
page, users can share the
product, clicking on the
“Share with [Facebook |
Google+]” buttons.
b. Products can have a list
of characteristics. Next to
each characteristic there
are “Share characteristics
with [Facebook |
Google+]” buttons, so
users can share specific
characteristics.
characteristic on
that social network.
After logging in, a
caption of the
product or product’s
characteristic is
published on the
user’s timeline at
the chosen social
network
WP206_D206.2_TEST_11
Users can donate
money to a
product or
characteristic
Users log in the
application and then:
1) Click on “List of
proposed products”
Users have to use a PayPal account to donate money. A connection is
Not yet
begun
2) Browse the list.
3) Users can donate
money clicking on a
“Donate” button next to
a product.
4) Optionally, users can
click on the name of a
product to see a full
description of the
product.
a. On the description
page, users can donate
money to the product,
clicking on a “Donate”
button.
b. Products can have a list
of characteristics. Next to
each characteristic there
is a “Donate to
Characteristic” button, so
users can donate money
to a specific
characteristic.
created with PayPal to get the funds.
WP206_D206.2_TEST_12
Developers can
create surveys and
attach them to a
product or
Developers log in the
application and then:
1) Click on “Create
survey”
A new survey is created and all Users can fill it in.
Results are stored in
Not yet
begun
product’s
characteristic
2) Use a form to create
one or more questions.
a. There are many
different types of
question (Yes/No, Open
answer, Multiple
predefined answers…).
3) Click on “Attach survey
to product”
4) A list of products
appears and the
Developer chooses one or
more products.
5) Optionally, attach the
survey to one or more
specific characteristics of
a product.
6) Click on “Create”
button.
the database and
can be consulted by
Developers
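TEST_12 describes surveys with several question types that can be attached to products or characteristics, with answers stored for Developers to consult. The data-model sketch below is illustrative; the class names, field names and type strings are assumptions for the example, not the application's schema.

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    """One survey question; kind is 'yes_no', 'open' or 'multiple'
    (illustrative type strings, not the application's schema)."""
    text: str
    kind: str
    choices: list = field(default_factory=list)  # used for 'multiple'

@dataclass
class Survey:
    questions: list
    attached_to: list = field(default_factory=list)  # product/characteristic ids
    answers: list = field(default_factory=list)      # stored for Developers

    def attach(self, target_id):
        # Attach the survey to a product or to a specific characteristic.
        self.attached_to.append(target_id)

    def submit(self, answer_set):
        # Results are stored so that Developers can consult them later.
        self.answers.append(answer_set)
```

A survey can be attached to one or more products and, optionally, to specific characteristics, mirroring steps 3 to 5 of the test.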
WP206_D206.2_TEST_13
Users can fill a
survey that has
been attached to a
product or
product’s
characteristic
Users log in the
application and then:
1) Click on “List of
proposed products”
2) Browse the list.
3) Click on the name of a
product to see a full
description of the
The results of the
survey are stored on
the database
Not yet
begun
product.
4) Access the list of surveys associated with the product.
5) Click on a survey
6) Optionally, click on a
survey associated only
with a specific
characteristic.
7) Answer the questions
associated with the
survey.
8) Click on “Send” button.
T206.3 – Creation of Consumer Participatory R&D Mechanisms
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP206_D206.3_TEST_01
Developers can
create a prototype
of the product,
and attach it to a
current product
description, so
users can
download and test
it
Developers log in the
application and then:
1) Click on “Create
prototype”.
2) Fill a form where the
prototype is described.
a. A working installation
program must be
included.
b. Optionally, installation
The prototype is
attached to a
product. Users can
download this
prototype to test it
Not yet
begun
instructions can be
included.
c. Optionally, a
description of the
prototype can be
included.
3) Click on “Assign
prototype” to product.
4) Choose the product
the prototype is
associated to.
5) Click on “Assign”.
WP206_D206.3_TEST_02
Users can vote or
comment on a
prototype
Users log in the
application and then:
1) Click on “List of
proposed products”
2) Browse the list.
3) Click on the name of a
product to see a full
description of the
product.
a. On the product’s list of
prototypes, users can
vote for a prototype by
clicking on the “Vote”
button next to a
prototype.
The vote or comment is added to the prototype
Not yet
begun
b. On the product’s list of
prototypes, users can
comment on a prototype
by clicking on the
“Comment” button next
to a prototype and filling
a form. Comments must
be approved by a
Developer.
WP206_D206.3_TEST_03
Users can donate
money to a
prototype
Users log in the
application and then:
1) Click on “List of
proposed products”
2) Browse the list.
3) Users can donate
money clicking on a
“Donate” button next to
a product.
4) Optionally, users can
click on the name of a
product to see a full
description of the
product.
a. On the description
page, users can donate
money to the product,
clicking on a “Donate”
Users have to use a PayPal account to donate money. A connection is created with PayPal to get the funds.
Not yet
begun
button.
b. Products can have a list
of characteristics. Next to
each characteristic there
is a “Donate to
Characteristic” button, so
users can donate money
to a specific
characteristic.
WP206_D206.3_TEST_04
Developer can
attach a survey to
a prototype
Developers log in the
application and then:
1) Click on “Create
survey”
2) Use a form to create
one or more questions.
a. There are many
different types of
question (Yes/No, Open
answer, Multiple
predefined answers…).
3) Click on “Attach survey
to prototype”
4) A list of products
appears and the
Developer chooses one or
more products.
5) A list of the product’s
The survey is attached to the prototype and users can answer its questions
Not yet
begun
prototypes appears and
the Developer chooses
one or more prototypes.
6) Click on “Create”
button.
WP206_D206.3_TEST_05
Users can fill a
survey that has
been attached to a
prototype
Users log in the
application and then:
1) Click on “List of
proposed products”
2) Browse the list.
3) Click on the name of a
product to see a full
description of the
product.
4) Access the list of prototypes associated with the product.
5) Access the list of surveys associated with the prototype.
6) Click on a survey
7) Answer the questions
associated with the
survey.
8) Click on “Send” button.
The results of the
survey are stored on
the database
Not yet
begun
WP206_D206.3_TEST_06
Developer can
attach an
automated test to
a prototype
Users log in the
application and then:
1) Click on “Create
automated test”
2) Use a form to create
one or more steps.
a. There are many
different types of steps
that can be performed.
3) Click on “Attach test to
prototype”
4) A list of products
appears and the
Developer chooses one or
more products.
5) A list of the product’s
prototypes appears and
the Developer chooses
one or more prototypes.
6) Click on “Create”
button.
The test is
associated with the
prototype. Users
can execute the
automated test, and
the results are
automatically stored
on the database
Not yet
begun
WP206_D206.3_TEST_07
Users can execute
an automated test
attached to a
prototype
Users log in the
application and then:
1) Click on “List of
proposed products”
2) Browse the list.
3) Click on the name of a
The results of the
automated test are
stored on the
database
Not yet
begun
product to see a full
description of the
product.
4) Access the list of prototypes associated with the product.
5) Access the list of automated tests associated with the prototype.
6) Click on an automated
test
7) Follow the instructions
on screen to execute the
automated test.
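Tests 06 and 07 describe an automated test as a sequence of steps attached to a prototype, whose results are stored after execution. The runner sketch below is illustrative; a real implementation would drive the prototype itself rather than plain callables, and the step names are hypothetical.

```python
# Sketch of a step-based automated test runner; steps are (name, check)
# pairs where check is any callable returning truthy on success. The step
# names used below are hypothetical examples.
def run_automated_test(steps):
    """Execute each step in order and collect pass/fail results,
    as they would be stored in the database."""
    results = []
    for name, check in steps:
        try:
            ok = bool(check())
        except Exception:
            ok = False  # a crashing step counts as a failure
        results.append((name, "pass" if ok else "fail"))
    return results

steps = [
    ("prototype starts", lambda: True),
    ("main window opens", lambda: 1 + 1 == 2),
]
print(run_automated_test(steps))
```

Storing the returned list per prototype gives Developers exactly the per-step pass/fail record the expected results describe.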
T206.4 - Creation of Feedback and FeedPeer systems
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP206_D206.4_TEST_01
Developers can
create new
product proposals
Users log in the
application and then:
1) Click on “List of
products”
2) Browse the list.
3) Click on the name of a
product to see a full
description of the
A new product is
created
Not yet
begun
product.
4) Click on the “New
attached proposal”
button.
5) Fill the form with the
information about the
new proposal
WP206_D206.4_TEST_02
Users can add new
proposals to
existing products
Users log in the
application and then:
1) Click on “List of
products”
2) Browse the list.
3) Click on the name of a
product to see a full
description of the
product.
4) Click on the “New
attached proposal”
button.
5) Fill the form with the
information about the
new proposal.
The new proposal
must be approved
by a developer.
Then, it will appear
as a new proposed
characteristic of the
product
Not yet
begun
WP206_D206.4_TEST_03
Users can add a
vote to a product
or product’s
characteristic
Users log in the
application and then:
1) Click on “List of
products”
2) Browse the list.
As users vote
products and
product’s
characteristics, the
Not yet
begun
3) Users can vote clicking
on a “Vote” button next
to a product.
4) Optionally, users can
click on the name of a
product to see a full
description of the
product.
a. On the description
page, users can vote for
the product, clicking on a
“Vote” button.
b. Products can have a list
of characteristics. Next to
each characteristic there
is a “Vote Characteristic”
button, so users can vote
them also.
number of votes is shown on the lists
WP206_D206.4_TEST_04
Users can add
comments to
products or
product’s
characteristics
Users log in the
application and then:
1) Click on “List of
products”
2) Browse the list.
3) Users can add a
comment clicking on a
“Comment” button next
to a product.
The comment must
be approved by a
Developer before
being published.
Then, any user can
read it
Not yet
begun
4) Optionally, users can
click on the name of a
product to see a full
description of the
product.
a. On the description
page, users can add a
comment to a product by
clicking on a “Comment”
button.
b. Products can have a list
of characteristics. Next to
each characteristic there
is a “Comment
Characteristic” button, so
users can comment them
also.
WP206_D206.4_TEST_05
Share a product or
product’s
characteristic on a
social network
Users log in the
application and then:
1) Click on “List of
products”
2) Browse the list.
3) Users can share a
product clicking on the
“Share with [Facebook |
Google+]” buttons next to
a product.
Users must be logged in to a Facebook/Google+
account to share a
product or product’s
characteristic on
that social network.
After logging in, a
caption of the
product or product’s
Not yet
begun
4) Optionally, users can
click on the name of a
product to see a full
description of the
product.
a. On the description
page, users can share the
product, clicking on the
“Share with [Facebook |
Google+]” buttons.
b. Products can have a list
of characteristics. Next to
each characteristic there
are “Share characteristics
with [Facebook |
Google+]” buttons, so
users can share specific
characteristics.
characteristic is
published on the
user’s timeline at
the chosen social
network
WP206_D206.4_TEST_06
Users can donate
money to a
product or
characteristic
Users log in the
application and then:
1) Click on “List of
products”
2) Browse the list.
3) Users can donate
money clicking on a
“Donate” button next to
a product.
Users have to use a PayPal account to donate money. A connection is created with PayPal to get the funds
Not yet
begun
4) Optionally, users can
click on the name of a
product to see a full
description of the
product.
a. On the description
page, users can donate
money to the product,
clicking on a “Donate”
button.
b. Products can have a list
of characteristics. Next to
each characteristic there
is a “Donate to
Characteristic” button, so
users can donate money
to a specific
characteristic.
WP206_D206.4_TEST_07
Developers can
create surveys and
attach them to a
product or
product’s
characteristic
Developers log in the
application and then:
1) Click on “Create
survey”
2) Use a form to create
one or more questions.
a. There are many
different types of
question (Yes/No, Open
A new survey is created and all Users can fill it in.
Results are stored in
the database and
can be consulted by
Developers
Not yet
begun
answer, Multiple
predefined answers…).
3) Click on “Attach survey
to product”
4) A list of products
appears and the
Developer chooses one or
more products.
5) Optionally, attach the
survey to one or more
specific characteristics of
a product.
6) Click on “Create”
button.
WP206_D206.4_TEST_08
Users can fill a
survey that has
been attached to a
product or
product’s
characteristic
Users log in the
application and then:
1) Click on “List of
products”
2) Browse the list.
3) Click on the name of a
product to see a full
description of the
product.
4) Access the list of surveys associated with the product.
5) Click on a survey
6) Optionally, click on a
survey associated only
with a specific
characteristic.
7) Answer the questions
associated with the
survey.
8) Click on “Send” button.
The results of the
survey are stored on
the database
Not yet
begun
Table 3: List of SP2 test scenarios
9.2 SP3 Test Scenarios
WP301 – Communication, daily living, health and accessible mobility
T301.1 – Learning and training s/w applications
Face tracker camera input module integration
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP301_D301.2_TEST_01
Integration of
FaceTracker camera
input module (FHTW)
into FlashWords software prototype.
1) Start FlashWords
application
2) Move the head in order
to control the mouse cursor
The mouse cursor
moves according to
head movements.
Not yet
begun
T301.2 – Improving access to technology for dementia sufferers/carers
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP301_D301.3_TEST_1
Navigation is correct - all screens can be reached correctly
Run the application without
GPI
1) Attempt to reach each
screen and navigate back.
All screens can be
reached correctly
Not yet
begun
WP301_D301.3_TEST_2
Each player works
correctly - will be
broken down into more detailed tests
Run the application without
GPI and a predefined media
set.
1) Test each player
All players function
Not yet
begun
WP301_D301.3_TEST_3
When a user logs into
the GPII requested
settings must change to
those requested by the
user preferences.
Eg High contrast
throughout, speech
voice and rate.
Run the application without
GPII and check settings are
correct. Run GPII.
1) Log in a GPII user
2) confirm settings
3) logout user
4) ensure settings revert.
Not yet
begun
WP301_D301.3_TEST_4
When a user logs out
settings must revert.
After previous test
1) Log out GPII user
2) confirm settings
Not yet
begun
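Tests 3 and 4 together specify a simple invariant: logging in applies the user's requested settings on top of the defaults, and logging out restores the defaults exactly. The sketch below illustrates that invariant; the dictionary keys are illustrative examples, not GPII's actual preference terms.

```python
# Sketch of the login/logout settings behaviour in WP301_D301.3_TEST_3/4.
# The preference keys below are illustrative, not GPII's actual terms.
class SettingsManager:
    def __init__(self, defaults):
        self.defaults = dict(defaults)
        self.current = dict(defaults)

    def login(self, preferences):
        # On GPII login, the requested user preferences are applied.
        self.current.update(preferences)

    def logout(self):
        # On logout, settings must revert to the application defaults.
        self.current = dict(self.defaults)
```

Checking the test then amounts to asserting that `current` matches the preferences after `login` and matches `defaults` again after `logout`.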
T301.3 – Public Access Points to ICT
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP301_T301.3_TEST_1
The application will
control the pc with head
movements
Not yet
begun
T301.4 – Pluggable user interfaces for home appliances, home entertainment and home services
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP301_T301.4_TEST_1
Template URC Sockets:
Abstract user interface
descriptions for
common devices in
households (e.g., HVAC,
TV, Entertainment).
URC sockets are used to
build concrete device-
and user-specific
interfaces. The template
sockets contain the
basic abstract user interface information that most devices within a group (e.g., HVAC devices) have in common. Therefore, the template sockets can be used as a foundation by developers to build concrete user interfaces for many devices in households.
Not yet
begun
WP301_T301.4_TEST_2
Simple user interfaces
for older people and
people with mild
cognitive disabilities.
We will provide simple web
user interfaces for the
template URC sockets.
WP301_T301.4_TEST_3
Proof-of-Concept
implementation of the
template sockets and
the simple user
interfaces in the URCLab
(Smart-Home/AAL)
laboratory at the HdM
We will install real devices
(of common devices in
households [e.g., HVAC, TV,
Entertainment]) in our
laboratory (AAL-Lab) at the
HdM and use them as a test-
bed for the template
sockets and the simple user
interfaces
T301.6 – Routing Guidance System
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP301_D301.6_TEST_01
Correct voice
recognition for
dysarthric speech
(Parkinson's disease)
1) Open application
2) Go to settings
3) Select ASR
4) Select voice speed
level
5) Click "OK/Apply"
1) Voice command
accepted 2)
Request performed
Not yet
begun
WP301_D301.6_TEST_02
POIs (Service-on-
demand) are
captioned (SP204
mechanism) for older
and hearing impaired
users
1) Open application
2) Go to settings
3) Select POIs
4) Select captions
5) Select "On"
6) Click "OK/Apply"
1) Captions are
activated for
certain POIs
2) Captions are
presented on time
3) Captions are correct
Not yet
begun
WP301_D301.6_TEST_03
Stress detection
whilst in traffic
1) Open application
2) Go to settings
3) Select "Connect to USB device" (or app?)
4) Connect to stress recording device (or app?)
5) Get "Measurement was successful" message
6) Display measurement and/or result (Stressed/Not stressed)
7) Prompt "Save measurement"
8) Push "Save" or "Discard" button
1) External device USB connection successfully established (if not app)
2) Start measurement
3) Stop measurement
4) Display measurement
5) Display result ("Stressed/Not stressed")
6) Save or discard measurement
7) Measurement is saved or measurement is discarded
Not yet
begun
WP302 - Education eLearning, Business and Employment
T302.1 – Accessible BI
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP302_BI_TEST1
User will log on to the business intelligence solution using the P4All authentication infrastructure. The system will recognize the need for adaptive technologies and redirect the user to an assistive technologies interface.
1. login as a user in need of
assistive technology
1. the system will
directly open the
accessible version of
the console.
2. After login the user
will be provided with a
list of accessible
business intelligence
reports
3. User can adapt the visualization to better suit his/her visual impairments.
Not yet
begun
WP302_BI_TEST2
User will visualize a list of available reports supporting assistive technologies
User selects a BI report
1. A report supporting assistive technologies will be displayed to the user
Not yet
begun
WP302_BI_TEST3
Geo-referenced Business Intelligence report access
User selects a georeferenced BI report
1. Georeferenced BI is
displayed
2. the user activates the
visual impairment
adaptation
3. The data represented in the cartographic report will be described in textual and numerical format
4. user can adapt
graphical properties to
better suit his/her
visual impairment
5. user can access the
Not yet
begun
information activating
audio support to get
the information in the
report
6. user can interact
with braille support to
access the information
represented in the
Business Intelligence
report.
T302.2 - Counselling and printing services
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP302-T302.2 - Counselling
and printing services -
Registration
Register for the web service
Register for a "print service"
account, including name, address, institute, payment, etc.
Wait for approval
and get your
account.
Not yet
begun
WP302-T302.2 - Counselling
and printing services -
Ordering
Ordering tactile material
1) Login at the print service
website.
2) Go to "New order".
3) Upload your SVG file and
add some comments.
4) Exit the website.
File will be manually
checked and if
something is not
clear, someone will
get back to you.
As soon as possible
you will receive your
tactile material.
Not yet
begun
T302.3 - Enhanced with P4A accessibility of learning material
Test Number Assessment
Objective
Indicator Test description Requirement Steps to execute Expected results Status
WP302-T302.3 -
Accessibility of learning
material - Making tactile
Material
Making tactile graphics
eLearning or real training on
making tactile graphics.
User will be able to
make graphics
himself.
Not yet
begun
WP302-T302.3 - Accessibility of learning material - Conversion tools
Test description: Online tool to convert LaTeX files into an accessible, easily readable HTML/XHTML result.
Steps to execute:
1) Log in at the online tool website.
2) Upload your TeX files and press "Convert".
Expected results: The result will be a download link to the converted HTML files. The conversion will not be 100% complete, because most LaTeX documents contain self-made commands and functions that cannot be converted automatically.
Status: Not yet begun
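The limitation named in the expected results, that self-made LaTeX commands cannot be converted automatically, can be illustrated with a minimal sketch. This is not the KIT-SZS tool; the command table and function names are assumptions for illustration:

```python
import re

# Minimal sketch of a LaTeX-to-HTML converter: known commands are
# rewritten to HTML, while any command left over is reported as
# unconvertible. This mirrors why the real conversion cannot be 100%
# complete when documents define their own macros.

KNOWN = {
    r"\\section\{(.*?)\}": r"<h1>\1</h1>",
    r"\\textbf\{(.*?)\}": r"<strong>\1</strong>",
    r"\\emph\{(.*?)\}": r"<em>\1</em>",
}

def latex_to_html(tex: str):
    """Convert a few known LaTeX commands to HTML; report the rest."""
    html = tex
    for pattern, repl in KNOWN.items():
        html = re.sub(pattern, repl, html)
    # Any backslash-command still present is one we could not convert.
    unknown = sorted(set(re.findall(r"\\[A-Za-z]+", html)))
    return html, unknown

html, unknown = latex_to_html(
    r"\section{Intro} \textbf{bold} and \myownmacro{x}"
)
```

Here `\myownmacro` stands in for a user-defined command: it survives conversion and is flagged, rather than silently producing broken output.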
T302.4 - Enhanced with P4A special educational programmes and learning tools
Test Number | Assessment Objective | Indicator | Test description | Requirement | Steps to execute | Expected results | Status
WP302_T302.4_TEST1
Test description: Integration of the open source Input Transducer prototyping module (FHTW) into the FlashWords software prototype.
Steps to execute:
1) Start the FlashWords application.
2) Activate the sensor in order to control the scanning layer of the UI.
Expected results: The scanning UI can be controlled with the sensor.
Status: Not yet begun
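The expected behaviour, a sensor event controlling a scanning UI layer, can be sketched as follows. The `ScanningUI` class is hypothetical and only illustrates the interaction pattern, not the FlashWords code:

```python
# Minimal sketch of switch/sensor scanning: the UI cycles through items
# automatically, and an external input transducer event selects the
# currently highlighted item. Class and method names are illustrative.

class ScanningUI:
    """Cycle through items; a sensor trigger selects the current one."""

    def __init__(self, items):
        self.items = items
        self.index = 0
        self.selected = None

    def advance(self):
        # Automatic scanning step: move the highlight to the next item.
        self.index = (self.index + 1) % len(self.items)

    def on_sensor_trigger(self):
        # External input transducer event: select the highlighted item.
        self.selected = self.items[self.index]

ui = ScanningUI(["cat", "dog", "bird"])
ui.advance()            # scanning highlight moves to "dog"
ui.on_sensor_trigger()  # the sensor selects it
```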
T302.5 – FLOE integrated with P4A
Test Number | Assessment Objective | Indicator | Test description | Requirement | Steps to execute | Expected results | Status
WP302_T302.5_TEST2
Test description: Validate automated test coverage.
Steps to execute: Prior to major project milestones such as pilots, all unit and acceptance tests will be run in supported browsers. The anticipated list may include the latest versions of Chrome, Firefox, Internet Explorer and Safari on Mac and Windows.
Expected results: All tests should pass.
Status: Not yet begun
WP302_T302.5_TEST2
Test description: Perform manual QA tests based on documented test plans.
Steps to execute: Each Floe component will include a set of automated and manual QA test plans that should be followed to verify the component's functionality.
Expected results: Manual test activities should produce passing results.
Status: Not yet begun
WP303 – Assistance on Demand Service
T303.1 - Consumer assistance on demand system
Test Number | Assessment Objective | Indicator | Test description | Requirement | Steps to execute | Expected results | Status
T303.1: Consumer Assistance on Demand system
Test description: James uses screen reader software in order to operate his computer at work. There is a problem with some of the work he is reviewing, since some graphs are not explained and labelled as they should be. In order to ask for help, James first uses the Prosperity4All bug report in order to record the problem and forward it to a colleague. The bug is noticed by the company, and someone from the "crowd" or from work records an explanation of the graph.
Status: Not yet begun
T303.2 – Business Assistance on Demand system
T303.3 - Enhancing existing technical AOD services
Test Number | Assessment Objective | Indicator | Test description | Requirement | Steps to execute | Expected results | Status
WP303_T303.3_TEST1
Test description: Integration of the AoD module into the help system of the FlashWords software prototype.
Steps to execute:
1) Start the FlashWords application.
2) Open the help dialog.
3) Press the "Help from LIFEtool" button.
Expected results: The problem of the user is solved.
Table 4: List of SP3 test scenarios
10 ANNEX II: Assessment objectives and indicators
1. Functionality
The capability of the software product to provide functions which meet stated and implied needs
when the software is used under specified conditions.
a. Suitability
The capability of the software product to provide an appropriate set of functions for specified tasks
and user objectives.
b. Accuracy
The capability of the software product to provide the right or agreed results or effects with the
needed degree of precision.
c. Interoperability
The capability of the software product to interact with one or more specified systems.
d. Security
The capability of the software product to protect information and data so that unauthorised
persons or systems cannot read or modify them and authorised persons or systems are not denied
access to them.
e. Functionality compliance
The capability of the software product to adhere to standards, conventions or regulations in laws
and similar prescriptions relating to functionality.
2. Reliability
The capability of the software product to maintain a specified level of performance when used
under specified conditions.
a. Maturity
The capability of the software product to avoid failure as a result of faults in the software.
b. Fault tolerance
The capability of the software product to maintain a specified level of performance in cases of
software faults or of infringement of its specified interface.
c. Recoverability
The capability of the software product to re-establish a specified level of performance and recover
the data directly affected in the case of a failure.
d. Reliability compliance
The capability of the software product to adhere to standards, conventions or regulations relating
to reliability.
3. Usability
The capability of the software product to be understood, learned, used and attractive to the user,
when used under specified conditions.
a. Understandability
The capability of the software product to enable the user to understand whether the software is
suitable, and how it can be used for particular tasks and conditions of use.
b. Learnability
The capability of the software product to enable the user to learn its application.
c. Operability
The capability of the software product to enable the user to operate and control it.
d. Attractiveness
The capability of the software product to be attractive to the user.
e. Usability compliance
The capability of the software product to adhere to standards, conventions, style guides or
regulations relating to usability.
4. Efficiency
The capability of the software product to provide appropriate performance, relative to the amount
of resources used, under stated conditions.
a. Time behavior
The capability of the software product to provide appropriate response and processing times and
throughput rates when performing its function, under stated conditions.
b. Resource utilization
The capability of the software product to use appropriate amounts and types of resources when
the software performs its function under stated conditions.
c. Efficiency compliance
The capability of the software product to adhere to standards or conventions relating to efficiency.
5. Maintainability
The capability of the software product to be modified. Modifications may include corrections,
improvements or adaptation of the software to changes in environment, and in requirements and
functional specifications.
a. Analyzability
The capability of the software product to be diagnosed for deficiencies or causes of failures in the
software, or for the parts to be modified to be identified.
b. Changeability
The capability of the software product to enable a specified modification to be implemented.
c. Stability
The capability of the software product to avoid unexpected effects from modifications of the
software.
d. Testability
The capability of the software product to enable modified software to be validated.
e. Maintainability compliance
The capability of the software product to adhere to standards or conventions relating to
maintainability.
6. Portability
The capability of the software product to be transferred from one environment to another.
a. Adaptability
The capability of the software product to be adapted for different specified environments
without applying actions or means other than those provided for this purpose for the software
considered.
b. Installability
The capability of the software product to be installed in a specified environment.
c. Co-existence
The capability of the software product to co-exist with other independent software in a
common environment sharing common resources.
d. Replaceability
The capability of the software product to be used in place of another specified software
product for the same purpose in the same environment.
e. Portability compliance
The capability of the software product to adhere to standards or conventions relating to
portability.
11 ANNEX III: Data Collection File
11.1 Sheet 1: Workpackage Information
Deliverable Number
Deliverable Title
Work package #:
Title:
Description:
Source:
Interface type:
Responsible team/partner:
Contact:
Table 5: Workpackage Information
11.2 Sheet 2: Features, Functionalities & Capabilities of Tool/Service under technical validation
Title | Description | Technologies | Final User
Table 6: Features and Capabilities
11.3 Sheet 3: Test Scenarios of Tool/Service under technical validation
Number | Description | Requirement Link | Steps to execute | Expected Results | Status
WP201_D201.1_TEST_1 | Application should export the user data in an XML format | XXX | With the users populated in the system, proceed with the following actions: 1) Open the application 2) Go to Tools > Export 3) Click on "Select export location" 4) Select a path 5) Click on "Export" | An XML file containing all the system users' information should be placed in the selected directory. | PASS
Table 7: Test Scenarios
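The expected result of the example scenario above (WP201_D201.1_TEST_1) could also be checked automatically: an XML file must appear in the selected directory and parse as a well-formed list of users. The `<users>`/`<user>` element names below are assumptions for illustration, since the real export schema is not specified here:

```python
import tempfile
import xml.etree.ElementTree as ET
from pathlib import Path

# Minimal sketch of verifying the export step of WP201_D201.1_TEST_1.
# The element names <users>/<user> are hypothetical.

def verify_export(export_dir: Path) -> int:
    """Return the number of exported users; fail if the export is bad."""
    files = list(export_dir.glob("*.xml"))
    assert files, "no XML file found in the selected export directory"
    root = ET.parse(files[0]).getroot()  # raises ParseError if malformed
    return len(root.findall("user"))

# Simulated export, standing in for steps 1)-5) of the scenario:
export_dir = Path(tempfile.mkdtemp())
(export_dir / "users.xml").write_text(
    "<users><user name='a'/><user name='b'/></users>"
)
```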
11.4 Sheet 4: Responsible Teams
List of deliverables (SP2)
WP Task Number Deliverable Name (tool or service) Leader Contact person
201
T201.1 Developer Space IDRC Colin Clark ([email protected])
T201.2 Unified Listing RTF-I Gregg Vanderheiden ([email protected]); Eva de Lera ([email protected])
T201.2 Open Marketplace for developers RTF-I Gregg Vanderheiden ([email protected]); Eva de Lera ([email protected])
T201.2 & T201.3 Payment infrastructure, micropayment and use bid systems, incl. security modules SILO Gianna Tsakou ([email protected])
T201.4 Gamification Prototypes & modules HdM Andreas Stiegler ([email protected])
202
T202.1 Repository of components KIT Till Riedel ([email protected])
T202.2 AsTeRICS AT Modules FHTW Chris Veigl ([email protected])
T202.2 Robobraille Translator modules Sensus Lars Ballieu Christensen ([email protected])
T202.3 Generic multimodal interaction modules FHTW Chris Veigl ([email protected])
T202.4 Smart Device and Environment Interconnection modules KIT Till Riedel ([email protected])
T202.5 Real time user monitoring modules CERTH Konstantinos Votis ([email protected])
T202.6 Web-based smart personalisation and interface adaptation modules IDRC Colin Clark ([email protected])
203
T203.1 Development tools for adaptive interfaces for mainstream applications FhG Matthias Peissner ([email protected])
T203.2 AT configuration environment KII Stefan Parker ([email protected])
T203.3 Runtime environment UCY Achilleas Achilleos ([email protected])
T203.4 Guidelines and frameworks for low cognitive and stepping stone applications for low digital literacy OD Steve Lee ([email protected])
204
T204.1, T204.2, T204.5, T204.6
Extended crowd-corrected captioning platform PCF
Janet Dragojevic ([email protected]) / Dean Jansen ([email protected])
T204.3, 204.4, T204.5, T204.6 Extended RoboBraille engine Sensus Lars Ballieu Christensen ([email protected])
205 T205.1 - T205.5
Assistance on Demand services and functionalities SILO Gianna Tsakou ([email protected])
206 T206.1 - T206.5
Consumer-Developer connecting modules and services TECH Víctor Manuel Hernández Ingelmo ([email protected])
List of deliverables (SP3)
WP Task Number Deliverable Name (tool or service) Leader Contact person
301
T301.1 Enhanced with P4A learning and training s/w applications Lifetool Stefan Schuerz ([email protected])
T301.2 Enhanced with P4A application for dementia sufferers OD Steve Lee ([email protected])
T301.3 Enhanced with P4A public access points to ICT Guada Alberto Corpas ([email protected])
T301.4 Enhanced with P4A pluggable UIs for home environments HdM Lukas Smirek ([email protected])
T301.5 Enhanced with P4A game-based cognitive rehabilitation SILO Gianna Tsakou ([email protected])
T301.6 Enhanced with P4A routing guidance system CERTH Kostas Kalogirou ([email protected]), Katerina Touliou ([email protected])
302
T302.1 Enhanced with P4A accessible BI ENG Marco Aimar ([email protected])
T302.2 Enhanced with P4A counselling and printing services KIT-SZS Thorsten Schwarz ([email protected]); Till Riedel ([email protected])
T302.3 Enhanced with P4A accessibility of learning material KIT-SZS Thorsten Schwarz ([email protected]); Till Riedel ([email protected])
T302.4 Enhanced with P4A special educational programmes and learning tools Lifetool Stefan Schuerz ([email protected])
T302.5 FLOE integrated with P4A IDRC Colin Clark ([email protected])
303
T303.1 Consumer assistance on demand system TECH Víctor Manuel Hernández Ingelmo ([email protected])
T303.2 Enhanced with P4A business AoD TECH Víctor Manuel Hernández Ingelmo ([email protected])
T303.3 Enhanced with P4A technical AoD Lifetool Stefan Schuerz ([email protected])
Table 8: Responsible teams