VWData

The VWData research programme is the first concrete result of the Dutch National Research Agenda route "Value Creation through Responsible Access to and use of Big Data".

VWData Research programme (Dutch National Research Agenda Startimpuls)

Big Data plays a major role in our society, in business and in science. A growing number of choices and decisions are based on the analysis of data collected. However, it is often not clear who can and may have access to the data, exactly how analysis methods work and how accurate these are, and how Big Data can be deployed in a way that is legally and ethically responsible and also acceptable to society. The thematic programme VWData provides a boost to scientific research and to technical and societal solutions for handling Big Data responsibly. VWData is funded from the Startimpuls programme of the Dutch National Research Agenda.    

Approach

The VWData research programme consists of eight strongly connected research projects. Each project concentrates on specific aspects of the challenge "responsible access to and use of Big Data". The terms FACT and FAIR data science play a key role in the programme as overarching concepts. FACT refers to responsible data science with respect to Fairness, Accuracy, Confidentiality and Transparency. FAIR refers to the properties that research data should have if it is to be used in an optimal manner: Findable, Accessible, Interoperable, Reusable.

The research activities within each project focus on a specific societal, economic or scientific use case. The programme has been put together in such a way that there is value creation in three domains: information services/media, healthcare and security. This means that a concomitant boost is given to the knowledge and innovation system within the broad area of data science as well as to the domains selected. Over a period of two years, various symposia will also be organised for researchers, business and policymakers. In addition, publications will appear with the most important insights as well as recommendations for the future. Overarching activities will be coordinated by the Netherlands eScience Center, TNO and the Netherlands Organisation for Scientific Research (NWO). The eight projects are briefly described below.

  • FairNews: News provision in a Big Data era

Contact person: Claes de Vreese (University of Amsterdam)

Equal access to news is a prerequisite for a healthy democracy. Data analytics and personalised recommendations make it possible to preselect news on the basis of individual user profiles and 'social sorting'. This project will investigate to what extent algorithms can and may be used in the filtering of information with a view to "fairness". If access to information is not equal, then this can have major consequences for the freedom of expression and non-discrimination. This research will also contribute to the ability to explain why algorithms exhibit certain behaviour, in other words to transparency. The results will lead to a fair recommender system that will make the bias of news recommendations visible. In this project, computer scientists, communication researchers and information law researchers from the University of Amsterdam and Delft University of Technology will work together with de Volkskrant newspaper on demonstrations and tests with news users in order to stimulate the debate about the fair provision of news. The findings from this project will also contribute to a large extent to the programme-wide understanding of what we mean by responsible access to and use of Big Data.
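
As a purely illustrative sketch of the kind of bias measurement such a fair recommender could make visible (the source names, data and metric below are hypothetical and not part of the FairNews system), one could compare how often each news source appears in a set of recommendations with that source's share of the overall corpus:

```python
# Hypothetical sketch: quantify how far a recommendation list's exposure of
# news sources deviates from the sources' share of the overall corpus.
from collections import Counter

def source_shares(items):
    """Fraction of items per news source."""
    counts = Counter(item["source"] for item in items)
    total = sum(counts.values())
    return {source: n / total for source, n in counts.items()}

def exposure_bias(recommended, corpus):
    """Total variation distance between recommended and corpus source shares."""
    rec, ref = source_shares(recommended), source_shares(corpus)
    sources = set(rec) | set(ref)
    return 0.5 * sum(abs(rec.get(s, 0.0) - ref.get(s, 0.0)) for s in sources)

corpus = [{"source": s} for s in ["A"] * 50 + ["B"] * 30 + ["C"] * 20]
recommended = [{"source": s} for s in ["A"] * 8 + ["B"] * 2]
print(f"exposure bias: {exposure_bias(recommended, corpus):.2f}")  # 0.30
```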

  • Capturing Bias: Diversity-aware Computation for Accurate Big Media Data Analysis   

Contact person: Lora Aroyo (VU Amsterdam)

This project focuses on achieving reliable and explainable Big Data analysis of media files. In particular, metrics will be developed which indicate how accurate the results of the Big Data analysis are. To this end, crowdsourcing of citizens' opinions will be used, for example. Whereas in the previous project the emphasis is on achieving "fairness", here the emphasis is on quantifying the degree of "accuracy", the relationship to other values such as diversity, and subsequently visualising the resulting complexity of decision-making in the media analysis. In the project, computer scientists, media researchers and governance experts from VU Amsterdam, Utrecht University, Leiden University and Delft University of Technology will work together with the Netherlands Institute for Sound and Vision on use cases and technical demonstrations. In doing so, this project not only seeks to achieve scientific progress but also to provide an action perspective for new types of presentation and communication about the results of Big Data analysis techniques.
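
As a simplified, hypothetical illustration of how crowdsourced judgements can feed such an accuracy metric (the clip identifiers, labels and majority-agreement measure below are made up, not the project's actual metrics), low agreement among annotators can flag analysis results that should be reported with lower confidence:

```python
# Hypothetical sketch: per-item agreement among crowd annotators as a simple
# indicator of how confidently an automatic media analysis result can be used.
from collections import Counter

def majority_agreement(labels):
    """Fraction of annotators that chose the most common label for one item."""
    counts = Counter(labels)
    return counts.most_common(1)[0][1] / len(labels)

crowd_judgements = {
    "clip_001": ["protest", "protest", "protest", "celebration"],
    "clip_002": ["interview", "debate", "interview", "debate"],
}
for clip, labels in crowd_judgements.items():
    print(clip, f"agreement = {majority_agreement(labels):.2f}")
# clip_001: 0.75 (fairly unambiguous); clip_002: 0.50 (ambiguous item)
```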

  • Enabling of privacy-friendly analysis of network data and beyond

Contact person: Joeri de Ruiter (Radboud University)

Big Data analysis is currently characterised not just by the enormous computing power required but also by the broadband network connections between computers and the cloud. This network traffic contains a lot of privacy-sensitive information, from the terms used in search engines to personal medical information. Recent research into pseudonymisation has made major progress thanks to the approach of Polymorphic Encryption and Pseudonymisation (PEP). Thanks to PEP, data cannot simply be combined in the event of a leak, while the permitted sharing of data is still possible. In this project, computer scientists from Radboud University and the University of Twente will work together with SURF to further develop this technique so that large quantities of data can be pseudonymised, shared and analysed at high speed in a privacy-friendly manner. The results of this project are technical in nature and can be viewed as a building block for the last two projects (see later), while the other projects described can provide ethical, legal and societal input for this project in order to achieve the desired degree of privacy and pseudonymisation.
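
PEP itself is a cryptographic scheme based on polymorphic encryption; the snippet below is only a toy illustration of the more general idea behind pseudonymisation, namely that each receiving party sees a different, non-linkable pseudonym for the same person. It uses a keyed hash rather than PEP's actual cryptography, and the party names and keys are made up:

```python
# Toy illustration only, NOT the PEP scheme: derive a different pseudonym per
# receiving party so leaked data sets cannot simply be joined on identifiers.
import hashlib
import hmac

def pseudonym(identifier: str, party_key: bytes) -> str:
    """Party-specific pseudonym from a keyed hash (HMAC-SHA256)."""
    return hmac.new(party_key, identifier.encode(), hashlib.sha256).hexdigest()[:16]

party_keys = {"hospital": b"key-for-hospital", "research-group": b"key-for-research"}
for party, key in party_keys.items():
    print(party, pseudonym("patient-12345", key))
# The same person maps to different pseudonyms per party; in PEP, a trusted
# transcryptor can translate between such pseudonyms cryptographically,
# without ever exposing the plain identifier.
```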

  • Responsible Collection and Analysis of Personal Data in Law Enforcement

Contact person: Marc Steen (TNO)

This project investigates how law enforcement organisations such as the police can collect data about suspects and other citizens in a transparent and ethical manner without putting the effectiveness of the detection methods under (too much) pressure: “with and for society”.  

  • Data-driven service innovation: compliance and transparency 'by design'

Contact person: Marlies van Steenbergen (Utrecht University of Applied Sciences)

To achieve FAIR and FACT, suppliers of data-driven services must satisfy the rules about data protection, and, in particular, the concrete requirements of transparency of algorithmic decision-making in the General Data Protection Regulation (GDPR). This project will follow the value-sensitive design approach as a development method for data-driven service innovation. Compared to the other – more scientific – projects in VWData, this project focuses on the tangible operationalisation of the legal requirements in one or more concrete data-driven services that can serve as use cases in the other projects. In this manner, the researchers from Fontys, Utrecht and Zuyd universities of applied sciences and Open Universiteit will contribute to making the research results tangible and this will provide new connections with universities and other knowledge institutions.  

  • Explainable and Secure AI

Contact person: Stephan Raaijmakers (TNO)

Modern, but also traditional, AI solutions are vulnerable to subtle external disruptions. For example, deep neural networks can be misled by input manipulations that are not detectable by humans. This plays a role, for example, in image-processing AI, such as that present in driverless vehicles. In these applications, minuscule noise injections in inputs can lead to the incorrect recognition of traffic signs, with all the consequences that entails. In addition, machine-learning models can be read remotely by eliciting their input/output behaviour, or can be manipulated externally, which poses a real danger because a growing amount of AI is becoming available as a (cloud) service. Making AI secure is therefore a subject that is becoming increasingly important on the research agenda of tech companies such as Google, Facebook and Tesla, as well as in academia. Measures that guarantee privacy protect the users of AI systems. However, both types of security (for the AI system and for the privacy of users) add extra opacity to AI systems, via the sometimes destructive and non-reversible encryption of data or algorithms. At the same time, there are increasing calls within the AI world for explainability, for example in the context of operator-intensive applications such as defence. The provision of personal data to current 'human-aware' AI likewise calls for explainable systems, in order to increase the willingness to share such information with AI systems. This project will investigate the effects of making AI secure on explainability, and as such will focus on 'explainable, secure AI'. In addition, it will address the explainability of attacks on AI.
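
A self-contained toy example of the kind of input manipulation described above (an FGSM-style perturbation on a linear classifier with made-up weights, not one of the project's models) shows how a change bounded by a small epsilon can flip a prediction:

```python
# Toy adversarial example in the spirit of FGSM on a hypothetical linear model:
# a small, sign-based perturbation of the input flips the classifier's output.
import numpy as np

w, b = np.array([1.5, -2.0, 0.5]), 0.1   # made-up "trained" model parameters
x, y = np.array([0.2, 0.4, 0.9]), +1     # input with true label +1

def predict(v):
    return +1 if w @ v + b > 0 else -1

print("clean prediction:", predict(x))            # +1, i.e. correct

# For the loss L = -y * (w.x + b), the gradient w.r.t. the input is -y * w;
# stepping in the sign of that gradient pushes x across the decision boundary.
eps = 0.3
x_adv = x + eps * np.sign(-y * w)
print("adversarial prediction:", predict(x_adv))  # -1, i.e. flipped
print("max input change:", np.max(np.abs(x_adv - x)))  # bounded by eps
```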

  • Distributed FAIR information systems to enable federated learning and reasoning 

Contact person: Cees de Laat (University of Amsterdam)

Data sources from different owners increase considerably in value if they can be combined. However, the data is often too diverse or too sensitive to be published. Consequently, there is a strong increase in interest in federative solutions for Big Data analysis. In this project, computer scientists from the University of Amsterdam, VU Amsterdam, Leiden University and TNO will work together with a series of users (such as life scientists in GO-FAIR and ASTRON) on developing the architecture for a network of FAIR data hubs and services. The resulting architecture will offer learning and reasoning processes with complete transparency, and will anticipate heterogeneous data with varying degrees of reliability. The project will provide service demonstrations based on a federation of at least three different data hubs. This project will draw on knowledge about privacy, fairness and accuracy from the other projects.
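
As a highly simplified, hypothetical sketch of the federative idea (three made-up data hubs, synthetic data, and plain weighted parameter averaging rather than the project's actual architecture), each hub fits a model locally and only model parameters, never raw records, leave the hub:

```python
# Hypothetical sketch of federated averaging across three data hubs: each hub
# solves a local least-squares problem and only the fitted parameters are shared.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])           # ground truth used to synthesise data

def local_fit(n_records):
    """One hub: generate local data and fit a linear model locally."""
    X = rng.normal(size=(n_records, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n_records)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w, n_records

hubs = [local_fit(n) for n in (100, 250, 60)]        # three federated hubs
total = sum(n for _, n in hubs)
global_w = sum(w * n for w, n in hubs) / total       # weighted parameter average
print("federated estimate:", np.round(global_w, 3))  # close to true_w
```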

  • Analysing partitioned FAIR health data responsibly 

Contact person: Michel Dumontier (Maastricht University)

Whereas in the previous project the emphasis was on the architecture of federative solutions, this project concentrates on the federative learning network, and on the ethical, legal and societal aspects of this. The project focuses on the federative use of all data in the “Maastricht study” (10,000 citizens) and related data from Statistics Netherlands to realise an understanding of the relationships between diabetes, lifestyle and socioeconomic factors. The project, which will be carried out by computer scientists, medical practitioners and ethicists from Maastricht University, ties in with the national “Personal Health Train” initiative. Thanks to synergy in the areas of algorithmic, legal and ethical issues, this offers the other projects the opportunity to build new connections within the medical sector. At the same time, lessons will be learnt in this project from use cases in other domains such as security and media and information services.    

The Consortium members include:

Erasmus University Rotterdam, Leiden University Medical Centre, Open Universiteit, Radboud University, Delft University of Technology, Eindhoven University of Technology, UMC Maastricht, Leiden University, Maastricht University, University of Twente, Utrecht University, University of Amsterdam, VU Amsterdam, Wageningen University and Research Centre, Institute for Information Law (IViR), Data Science Center Eindhoven (DSC/e), Netherlands eScience Center, SURF, TNO, Fontys University of Applied Sciences, Utrecht University of Applied Sciences, Amsterdam University of Applied Sciences, Zuyd University of Applied Sciences.    

Contact:

Prof. R.L. (Inald) Lagendijk, Delft University of Technology
Figurehead of VWData and of the Dutch National Research Agenda route 'Creating value through responsible access to and use of Big Data'
