Monitoring complex, dynamic phenomena, such as the movement of financial markets or the traffic in communication infrastructures, produces highly detailed, evolving
data streams. Glitches in the monitoring and data delivery infrastructure can lead to a variety of quality problems
in the streams, including lost, late, or out-of-order data delivery, and erroneous observations. Since data streams
are often continuously analyzed to enable real-time, data-driven decision making, it is imperative to continuously
measure and alert on the quality of the data streams as well. This tutorial (i) presents an end-to-end framework for data quality measurement in a dynamic environment of evolving data streams, (ii) surveys a variety of recent research from the data management and data mining communities within this framework, and (iii) identifies a range of open problems in this area.
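
As a concrete illustration of the kind of continuous quality checks motivated above, the following is a minimal Python sketch of a per-stream monitor that flags the problems the abstract names: late, out-of-order, lost, and erroneous observations. The class, parameters, and thresholds are hypothetical assumptions for illustration, not the framework presented in the tutorial.

```python
from typing import Optional


class StreamQualityMonitor:
    """Illustrative (hypothetical) monitor for a single timestamped stream.

    Flags late, out-of-order, possibly lost, and erroneous observations;
    thresholds are example assumptions, not values from the tutorial.
    """

    def __init__(self, expected_interval: float, lateness_threshold: float):
        self.expected_interval = expected_interval    # expected seconds between events
        self.lateness_threshold = lateness_threshold  # tolerated delivery delay (seconds)
        self.last_event_ts: Optional[float] = None    # event time of the last observation

    def observe(self, event_ts: float, arrival_ts: float,
                value: Optional[float]) -> list[str]:
        alerts = []
        # Late delivery: the observation reached us long after it occurred.
        if arrival_ts - event_ts > self.lateness_threshold:
            alerts.append(f"late by {arrival_ts - event_ts:.1f}s")
        if self.last_event_ts is not None:
            # Out-of-order delivery: event time regresses relative to the previous event.
            if event_ts < self.last_event_ts:
                alerts.append("out-of-order arrival")
            # Possible loss: gap far larger than the expected inter-event interval.
            elif event_ts - self.last_event_ts > 2 * self.expected_interval:
                alerts.append("gap suggests lost observations")
        # Erroneous observation: a crude validity check (None/NaN) stands in
        # for richer, domain-specific validation rules.
        if value is None or value != value:
            alerts.append("missing or NaN value")
        if self.last_event_ts is None or event_ts > self.last_event_ts:
            self.last_event_ts = event_ts
        return alerts


# Example: a reading stamped at t=0 but delivered 30s later is flagged as late.
monitor = StreamQualityMonitor(expected_interval=1.0, lateness_threshold=5.0)
print(monitor.observe(event_ts=0.0, arrival_ts=30.0, value=42.0))  # ['late by 30.0s']
```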