Data Analysis in Experimental Techniques: Data Classification & Random Signal Analysis, Slides of Mechanical Engineering

Part of the lecture notes for a university course on experimental techniques, focusing on data analysis: the classification of data and the analysis of random signals. It covers methods for analyzing deterministic, periodic, and random signals, including the calculation of their mean, RMS value, and cross-correlation. It also introduces the concepts of autocorrelation and power spectrum, with formulas and examples to help understand these concepts.

Typology: Slides

2012/2013

Uploaded on 04/27/2013

Uploaded by: amla

Module 1: Introduction to Experimental Techniques
Lecture 3: Data analysis

The Lecture Contains:

Data Analysis
Classification of Data
Analysis of Random Signals
Fourier Transform Technique
Probability Density Function Approach

Classification of data

Data received by an observer from an experimental setup can be classified as in Figure 1.7. Methods of analyzing deterministic data are well established because the data is already in a form from which integral measures can be extracted. When periodic signals are encountered, it is conventional practice to present results for sinusoidal signals alone. This is because results for a general periodic signal can be constructed from those for harmonic signals using a Fourier decomposition of the form

x(t) = a_0 + \sum_{n=1}^{\infty} \left[ a_n \cos\!\left(\frac{2\pi n t}{T}\right) + b_n \sin\!\left(\frac{2\pi n t}{T}\right) \right]

Here $T$ is the time period of the signal, and the Fourier coefficients satisfy the condition $a_n, b_n \to 0$ as $n \to \infty$, so that the series converges. The coefficients can be determined from the formulas

a_0 = \frac{1}{T}\int_0^T x(t)\,dt, \qquad a_n = \frac{2}{T}\int_0^T x(t)\cos\!\left(\frac{2\pi n t}{T}\right)dt, \qquad b_n = \frac{2}{T}\int_0^T x(t)\sin\!\left(\frac{2\pi n t}{T}\right)dt

As an example, the pressure drop in a pipe carrying pulsatile flow can be determined as a weighted average of the individual pressure drops occurring in sinusoidally varying flows whose frequencies are integer multiples of that of the real problem.

Figure 1.7: Classification of Data

When the data available to the observer is random, one is forced to use statistical techniques. This is because even when a mean value is determinable, one requires prior knowledge of the length of the signal to be considered for averaging. This mean value can subsequently be used for deterministic analysis.
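As an illustration (not part of the original notes), the Fourier coefficients defined above can be estimated numerically by approximating the integrals with sums over one sampled period. This minimal sketch assumes NumPy and uniform sampling; the function name `fourier_coefficients` is a hypothetical helper introduced here.

```python
import numpy as np

def fourier_coefficients(x, t, T, n_max):
    """Estimate a_0, a_n, b_n of a T-periodic signal sampled at uniformly
    spaced times t covering exactly one period (rectangular-rule sums)."""
    dt = t[1] - t[0]
    # a_0 = (1/T) * integral of x(t) dt over one period
    a0 = np.sum(x) * dt / T
    # a_n = (2/T) * integral of x(t) cos(2*pi*n*t/T) dt; b_n uses sin
    a = np.array([2.0 / T * np.sum(x * np.cos(2 * np.pi * k * t / T)) * dt
                  for k in range(1, n_max + 1)])
    b = np.array([2.0 / T * np.sum(x * np.sin(2 * np.pi * k * t / T)) * dt
                  for k in range(1, n_max + 1)])
    return a0, a, b

# Sanity check on a signal with known coefficients:
# x(t) = 1 + 3 cos(2*pi*t/T) + 0.5 sin(4*pi*t/T)
T = 2.0
t = np.linspace(0.0, T, 1000, endpoint=False)
x = 1.0 + 3.0 * np.cos(2 * np.pi * t / T) + 0.5 * np.sin(4 * np.pi * t / T)
a0, a, b = fourier_coefficients(x, t, T, n_max=3)
print(a0, a[0], b[1])   # approximately 1.0, 3.0, 0.5
```

Because the sampling is uniform over an exact period, the discrete orthogonality of the trigonometric functions makes the rectangular-rule sums essentially exact for low harmonics.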
However, in many applications information about the randomness itself may be desired, and statistical measures of the signal will have to be calculated.

Autocorrelation

At a given instant, point `a' moves up while point `b' moves down. After a time interval these directions are reversed. Hence $\tau$ is a measure of the time period of fluctuation of an eddy of size $l$, and $1/\tau$ is the associated frequency. A distribution of eddy sizes then means that there exists a distribution of frequencies as well. In a boundary layer $l \sim \delta$, where $\delta$ is the boundary-layer thickness, and the largest value of the time period can be estimated conservatively from $\delta$ and a characteristic velocity. In flow past a cylinder, $l$ may be chosen as the cylinder diameter; in flow past a mesh, the grid size or the wire diameter, whichever is larger, can be used as an estimate of $l$. As a rule of thumb, the integration time should be 5 to 10 times the characteristic time period $\tau$.

Other quantities that are frequently required in the study of stationary random signals with a zero mean value are the autocorrelation and the power spectrum. These are defined below.

Autocorrelation:

R(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_0^T x(t)\, x(t+\tau)\, dt

Power spectrum: $S(f)\,df$ is the fraction of the kinetic energy present in the frequency interval $(f, f+df)$.

The largest value of $R$ occurs when $\tau = 0$. For larger values of $\tau$ the signal is only partly correlated with itself, and in general $R(\tau) \to 0$ as $\tau \to \infty$. Signals for which $\int_0^\infty R(\tau)\,d\tau$ is finite and non-zero are said to be coherent, since two widely separated events on the time scale continue to bear a relationship to each other. The quantity

\Lambda = \frac{1}{R(0)} \int_0^\infty R(\tau)\, d\tau

is called the integral time scale and is a measure of the time period over which the signal is correlated with itself. The total time for which the signal is acquired should be larger than $\Lambda$, so that the statistics are meaningfully evaluated.
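The autocorrelation and integral time scale above can be estimated from a sampled signal. The following sketch (an addition to the notes, assuming NumPy) uses a synthetic exponentially correlated signal, an AR(1) process, whose integral time scale is known in closed form, so the estimate can be checked; the cutoff at the first zero crossing of the correlation is a common practical convention, not something the notes prescribe.

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Sample autocorrelation R(tau) of a signal (mean removed),
    normalized so that rho(0) = 1."""
    x = x - np.mean(x)
    n = len(x)
    r = np.array([np.sum(x[:n - k] * x[k:]) / n for k in range(max_lag + 1)])
    return r / r[0]

def integral_time_scale(rho, dt):
    """Integrate rho(tau) up to its first zero crossing (practical cutoff)."""
    neg = np.where(rho <= 0.0)[0]
    stop = neg[0] if len(neg) else len(rho)
    return np.sum(rho[:stop]) * dt

# AR(1) process x[i] = phi*x[i-1] + noise: rho(k) ~ phi**k, so the
# integral time scale should come out near dt/(1 - phi) = 0.1 s.
rng = np.random.default_rng(0)
phi, dt, n = 0.9, 0.01, 100_000
x = np.zeros(n)
for i in range(1, n):
    x[i] = phi * x[i - 1] + rng.standard_normal()
rho = autocorrelation(x, max_lag=200)
print(integral_time_scale(rho, dt))   # roughly 0.1
```

Note that the total record here (1000 s) is several thousand integral time scales long, consistent with the requirement that the acquisition time be much larger than the integral scale.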
Fourier Transform Technique

The complex function $X(f)$ is defined as the Fourier transform of $x(t)$ and is calculated as

X(f) = \int_{-\infty}^{\infty} x(t)\, e^{-2\pi i f t}\, dt

The normalized power spectrum can then be calculated as

S(f) = \frac{|X(f)|^2}{x_{rms}^2}

where $x_{rms}$ is the RMS value of $x(t)$. It is possible to show that $S(f)$ and $R(\tau)$ form a Fourier transform pair, i.e.

S(f) = \int_{-\infty}^{\infty} R(\tau)\, e^{-2\pi i f \tau}\, d\tau, \qquad R(\tau) = \int_{-\infty}^{\infty} S(f)\, e^{2\pi i f \tau}\, df

Methods of calculating Fourier transforms are well established. In particular, the fast Fourier transform (FFT) algorithm has found wide usage in both software and hardware applications in signal processing. Hence the integrals appearing in the Fourier transforms defined above can be readily determined. Though the integrals given above are complex-valued, the property $R(\tau) = R(-\tau)$ guarantees that $S(f)$ is purely real. On the other hand, the Fourier integral for $R(\tau)$ is to be interpreted as the real part of the complex function. Typical autocorrelation functions and power spectra are sketched in Figure 1.10.

For a sinusoidal signal the power spectrum exhibits a peak at the signal frequency. This suggests a method of measuring the frequency of sinusoidal signals and the dominant frequencies of non-sinusoidal periodic signals. White noise is defined as a signal whose amplitude at a given instant is purely a random variable within certain limits. Hence the signal is correlated with itself when $\tau = 0$ and uncorrelated for all $\tau \neq 0$. The cross-correlation function for a pair of signals $x(t)$ and $y(t)$ is defined as

R_{xy}(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_0^T x(t)\, y(t+\tau)\, dt

Figure 1.10: Examples of Autocorrelation and Power Spectrum.

Here $S(f)$ is a band-limited function that is zero if $|f| > f_c$, where $f_c$ is a prescribed large value.
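The FFT route to the power spectrum can be sketched as follows (an illustration added here, assuming NumPy; the normalization chosen makes the spectrum sum to the mean-square value, one of several common conventions). For a sinusoid, the peak of the spectrum recovers the signal frequency, as the notes describe.

```python
import numpy as np

def power_spectrum(x, dt):
    """One-sided power spectrum of a real signal via the FFT, normalized
    so that sum(s) equals the mean-square value of the signal."""
    n = len(x)
    X = np.fft.rfft(x)                 # one-sided transform of a real signal
    f = np.fft.rfftfreq(n, d=dt)       # corresponding frequency axis in Hz
    s = np.abs(X) ** 2 / n**2
    s[1:] *= 2.0                       # fold negative frequencies onto positive ones
    return f, s

# A 10 Hz sinusoid sampled for an integer number of periods:
dt = 1e-3
t = np.arange(0.0, 2.0, dt)
x = np.sin(2 * np.pi * 10.0 * t)
f, s = power_spectrum(x, dt)
print(f[np.argmax(s)])   # 10.0 -- the spectral peak sits at the signal frequency
```

Because the record spans exactly 20 periods, the 10 Hz component falls on an FFT bin and the peak is sharp; with a non-integer number of periods the energy would leak into neighbouring bins.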
A Gaussian signal is one whose probability density function has a Gaussian profile. Such signals have a finite range of values of the time lag over which the autocorrelation is non-zero. Additionally, the power spectrum, interpreted as the harmonically decomposed kinetic energy, is spread over a range of frequencies. The central limit theorem of probability theory is worth recalling in this connection. This theorem states that the sum of a large number of identically distributed independent variables has a Gaussian probability density function, regardless of the shape of the density of the variables themselves. Signals in homogeneous, stationary turbulent flow that exhibit equilibrium between energy production and dissipation are known to exhibit a Gaussian probability density function. Hence deviation from Gaussian behaviour can be used as a measure of deviation from equilibrium itself. The shape of the Gaussian PDF for a zero-mean signal is given by the formula

p(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{x^2}{2\sigma^2}\right)

and is sketched in Figure 1.12. Here $\sigma$ is the RMS value of $x(t)$.

Figure 1.12: An Example of a Gaussian PDF.

In the definitions given in the previous slide for quantities such as the mean and RMS value, the signal $x(t)$ is available in digital form and stored in a computer. Many of these integrals can instead be evaluated in terms of the probability density function $p(x)$ of the signal. The advantages of this approach are:

1. $p(x)$ can be determined using hardware (instruments), and
2. $p(x)$ is usually a smooth function of its argument, and hence integrals involving $p(x)$ can be accurately calculated by high-order numerical integration formulas.
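The PDF approach can be illustrated numerically (a sketch added to the notes, assuming NumPy): estimate $p(x)$ as the fraction of time the signal spends in each amplitude window, then evaluate amplitude-domain integrals such as the RMS value against it. The helper names `pdf_estimate` and `gaussian_pdf` are introduced here for illustration.

```python
import numpy as np

def pdf_estimate(x, bins=100):
    """Estimate p(x): the fraction of time the signal spends in each
    amplitude window, divided by the window width Delta-x."""
    p, edges = np.histogram(x, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, p

def gaussian_pdf(x, sigma):
    """Zero-mean Gaussian PDF: p(x) = exp(-x^2/(2 sigma^2)) / (sigma sqrt(2 pi))."""
    return np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# A long Gaussian record: the estimated PDF should match the formula,
# and the RMS value computed in the amplitude domain should be ~1.
rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)
centers, p = pdf_estimate(x)
dx = centers[1] - centers[0]
sigma = np.sqrt(np.sum(centers**2 * p) * dx)   # x_rms^2 = integral of x^2 p(x) dx
print(sigma)   # close to 1.0
```

The smoothness of the estimated $p(x)$ with a long record is what makes the amplitude-domain integrals accurate; with a small window $\Delta x$ and a short record the histogram becomes noisy, as the notes caution on the next slide.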
However, the difficulty of having a long enough signal for the time-domain averages is now transferred to waiting for a long enough time to determine $p(x)$. The accuracy with which $p(x)$ is measured depends on the choice of the window $\Delta x$ and the total time $T$. In general, for small values of $\Delta x$ a large value of $T$ is required for satisfactory convergence of the limit process arising in the definition of $p(x)$. In terms of $p(x)$, the mean and RMS values are defined as follows:

\bar{x} = \int_{-\infty}^{\infty} x\, p(x)\, dx, \qquad x_{rms}^2 = \int_{-\infty}^{\infty} x^2\, p(x)\, dx

For signals that do not have a zero mean,

x_{rms}^2 = \int_{-\infty}^{\infty} (x - \bar{x})^2\, p(x)\, dx

The $n$th order moment of a signal with a zero mean is defined as

m_n = \int_{-\infty}^{\infty} x^n\, p(x)\, dx

The second-order moment of $x$ is the mean square. The normalized third moment is called the skewness factor; it is zero for a Gaussian signal. The normalized fourth moment is called the flatness factor, or kurtosis. Note that as $n$ increases, the accuracy with which $p(x)$ is determined for large $|x|$ becomes critical. Typical examples where the skewness and flatness factors are respectively large are shown in Figure 1.13.

Figure 1.13: Signals with Large Skewness (a) and Large Flatness (b).

The cross-correlation is determined in terms of PDFs as

\overline{xy} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\, y\, p(x, y)\, dx\, dy

where $p(x, y)$ is called the joint probability density function. It is defined as the fraction of the time for which $x$ lies between $x$ and $x + \Delta x$ and, simultaneously, $y$ lies between $y$ and $y + \Delta y$. The autocorrelation can be determined in terms of $p(x, y)$ by identifying $y$ as $x(t + \tau)$. Note that the PDF approach evaluates integrals in the amplitude domain alone. In comparison, the autocorrelation function represents time-domain statistics, while the spectra are descriptors of the flow field in the frequency domain.
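The moments can be computed directly from samples, which is equivalent to the amplitude-domain integrals above. This added sketch (assuming NumPy) contrasts a Gaussian signal, whose skewness is near 0 and flatness near 3, with an intermittently spiky signal of the kind sketched in Figure 1.13(b), whose flatness is far larger; the `moments` helper and the spiky test signal are constructions of this example.

```python
import numpy as np

def moments(x):
    """Mean, RMS, skewness factor m3/sigma^3, and flatness factor
    (kurtosis) m4/sigma^4 of a signal, computed about the mean."""
    xm = x - np.mean(x)
    sigma = np.sqrt(np.mean(xm**2))
    skew = np.mean(xm**3) / sigma**3
    flat = np.mean(xm**4) / sigma**4
    return np.mean(x), sigma, skew, flat

rng = np.random.default_rng(2)
g = rng.standard_normal(1_000_000)           # Gaussian reference signal
# An intermittent signal: mostly quiet, with rare large excursions.
spiky = np.where(rng.random(1_000_000) < 0.01, 10.0 * g, 0.1 * g)
print(moments(g)[2:])        # skewness near 0, flatness near 3
print(moments(spiky)[3])     # flatness far above 3
```

The large flatness of the spiky signal reflects exactly the sensitivity the notes mention: the fourth moment is dominated by the rare large-amplitude events, where $p(x)$ is hardest to measure accurately.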