Data validation for big live data

  • Data integration of heterogeneous data sources relies either on periodically transferring large amounts of data to a physical Data Warehouse or on retrieving data from the sources only on request. The latter creates what is referred to as a virtual Data Warehouse, which is preferable when use of the latest data is paramount. The downside, however, is added network traffic and degraded performance when data volumes are high. In this paper, we propose the use of a readCheck validator to ensure the timeliness of the queried data and to reduce data traffic. It is further shown that the readCheck allows transactions to update data in the data sources while obeying full Atomicity, Consistency, Isolation, and Durability (ACID) properties.
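The validation idea in the abstract can be sketched roughly as follows. This is a hypothetical illustration of rowversion-based read validation, not the paper's actual implementation; all class and method names (`Source`, `read_check`, `VirtualWarehouse`) are invented for the example. A client caches rows together with a version token (e.g. an ETag or rowversion) and, on a later read, sends only the token to the source; if the version still matches, the cached copy is known to be current and no payload is retransferred.

```python
class Source:
    """Stands in for a remote data source holding versioned rows."""

    def __init__(self):
        self.rows = {}  # key -> (rowversion, payload)

    def read(self, key):
        # Full transfer: returns both the rowversion and the payload.
        return self.rows[key]

    def read_check(self, key, version):
        # Cheap version-only round trip: no payload crosses the network.
        return self.rows[key][0] == version


class VirtualWarehouse:
    """Virtual integration layer that validates cached rows on access."""

    def __init__(self, source):
        self.source = source
        self.cache = {}  # key -> (rowversion, payload)

    def get(self, key):
        if key in self.cache:
            version, payload = self.cache[key]
            if self.source.read_check(key, version):
                return payload  # cached copy is still current
        # Row is missing or stale: refetch it, including the new version.
        self.cache[key] = self.source.read(key)
        return self.cache[key][1]
```

Under this scheme a repeated query costs only a version check per row instead of a full transfer, which is one way the claimed reduction in data traffic could be realized; the same version tokens could also serve as the validation step of an optimistic ACID update protocol.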

Metadata

Author: Laux, Friedrich
URN: urn:nbn:de:bsz:rt2-opus4-14530
URL: https://www.thinkmind.org/index.php?view=instance&instance=DBKDA+2017
Published in: DBKDA 2017 : the Ninth International Conference on Advances in Databases, Knowledge, and Data Applications : GraphSM 2017, the Fourth International Workshop on Large-Scale Graph Storage and Management
Document Type: Conference Proceeding
Language: English
Year of Publication: 2017
Tags: ETags; data validation; rowversion validation; virtual data integration
Number of Pages: 7
First Page: 30
Last Page: 36
Dewey Decimal Classification: 004 Computer Science
Open Access: Yes
Licence: Creative Commons - Attribution, NonCommercial, NoDerivatives