- This week, we begin with an article presenting a set of data governance principles that should enable consumers to take control of their data.
- Next is a piece about a potential data breach in Chinese short-form video app TikTok, allegedly involving up to 2 billion user database records.
- Then, we have a note on the impact of malware pollution on a data lake.
- Following that, we have an analysis of how poor data quality directly impacts business performance.
- Next is an essay describing the key features of data virtualization solutions.
- Lastly, we have a list of seven data quality best practices that organizations can implement to improve data performance.
Give People Control Of Their Data
It takes me less than 15 seconds to hit “I Agree” to the standard terms and conditions whenever I download an app. Whether I am impatient or just oblivious to the value of the data that I am giving away, I am not alone in this act. All over the world, billions of consumers and businesses are doing the same–giving away their data to be used by other agents for a slew of unintended purposes.
TikTok Hacked, Over 2 Bn User Database Records Stolen: Security Researchers
Cyber-security researchers on Monday discovered a potential data breach in Chinese short-form video app TikTok, allegedly involving up to 2 billion user database records. Several cyber-security analysts tweeted about the discovery of what they described as “a breach of an insecure server that allowed access to TikTok’s storage, which they believe contained personal user data”.
What’s Polluting Your Data Lake?
A data lake is a large repository of files and unstructured data collected from many, often untrusted, sources and stored and dispensed for business services, which makes it susceptible to malware pollution. As enterprises continue to produce, collect, and store more data, the potential for costly cyber risks grows. Every time you send an email or a text, you are producing data.
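As a rough illustration of the kind of hygiene this risk implies (not taken from the article), the sketch below flags files arriving in a data lake landing area whose extensions suggest they deserve a malware scan before ingestion. The directory layout and extension list are illustrative assumptions.

```python
# A minimal sketch: flag risky-looking files in a hypothetical data lake
# landing area so they can be scanned or quarantined before ingestion.
# The directory path and extension list are assumptions for illustration.
from pathlib import Path

RISKY_EXTENSIONS = {".exe", ".dll", ".js", ".vbs", ".docm", ".xlsm", ".zip"}

def flag_risky_files(landing_dir: str) -> list[Path]:
    """Return files in the landing area that warrant a malware scan."""
    flagged = []
    for path in Path(landing_dir).rglob("*"):
        if path.is_file() and path.suffix.lower() in RISKY_EXTENSIONS:
            flagged.append(path)
    return flagged

if __name__ == "__main__":
    for path in flag_risky_files("datalake/landing"):
        print(f"Quarantine candidate: {path}")
```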
How Poor Quality Of Data Directly Impacts Business Performance?
A Forrester study found that roughly 30% of analysts spend 40% of their time confirming and analyzing their data before using it for algorithmic implementations and strategic decision-making. These figures demonstrate the severity of the data quality problem, and it is not a simple one either. Take the healthcare sector, where poor data quality can lead to errors in diagnosing problems or in recommending treatment for urgent conditions.
What Are the Key Features Of Data Virtualization Solutions?
Data virtualization is an important tool for businesses of all sizes. By consolidating data into a virtual layer, businesses can improve the performance and efficiency of their data-driven applications while reducing the complexity and cost of data management. In this article, we’ll explore the key features of data virtualization solutions to help you choose the right tool for your business.
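To make the idea of a “virtual layer” concrete, here is a minimal sketch that queries two hypothetical physical sources (a CSV export and a Parquet file) through a single SQL interface using DuckDB, without first copying them into a warehouse. The file and column names are assumptions for illustration; real data virtualization platforms layer caching, security, and governance on top of this basic pattern.

```python
# A minimal sketch of the "virtual layer" idea: joining two physical
# sources through one SQL interface without moving the data.
# File and column names are hypothetical.
import duckdb

con = duckdb.connect()  # in-memory engine acting as the virtual layer

# Join customer records kept in a CSV export with orders stored as Parquet.
result = con.execute(
    """
    SELECT c.customer_id, c.region, SUM(o.amount) AS total_spend
    FROM read_csv_auto('customers.csv') AS c
    JOIN read_parquet('orders.parquet') AS o
      ON o.customer_id = c.customer_id
    GROUP BY c.customer_id, c.region
    ORDER BY total_spend DESC
    """
).fetchdf()

print(result.head())
```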
7 Data Quality Best Practices To Improve Data Performance
Data quality is essential for any analysis or business intelligence. Employing best practices lets organizations address issues that become even more critical and challenging as teams build a data analytics pipeline. Subtle problems can get magnified by improved automation and increased data aggregation.
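As a small illustration of what automated quality checks in such a pipeline can look like, the sketch below applies a few common checks (completeness, uniqueness, validity) to a hypothetical customer table using pandas. The column names and rules are assumptions, not the article’s own seven practices.

```python
# A minimal sketch of automated data quality checks on a hypothetical
# customer table. Column names and rules are illustrative assumptions.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return a dictionary of simple data quality metrics."""
    return {
        # Completeness: share of non-null values per column.
        "completeness": df.notna().mean().round(3).to_dict(),
        # Uniqueness: duplicate customer IDs are not expected.
        "duplicate_ids": int(df["customer_id"].duplicated().sum()),
        # Validity: ages should fall in a plausible range (missing counts as invalid).
        "invalid_ages": int((~df["age"].between(0, 120)).sum()),
        # Consistency: email addresses should contain an '@'.
        "malformed_emails": int((~df["email"].str.contains("@", na=False)).sum()),
    }

if __name__ == "__main__":
    sample = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "age": [34, 151, 28, None],
        "email": ["a@example.com", "bad-address", "c@example.com", None],
    })
    print(run_quality_checks(sample))
```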

Source: https://mailchi.mp/zigram/data-asset-weekly-dispatch_12_september_1