Environmental Data Analysis with MatLab, 2nd Edition
by Joshua Menke, William Menke
Title page
Table of Contents
Copyright
Dedication
Preface
Advice on scripting for beginners
1: Data analysis with MatLab
Abstract
1.1 Why MatLab?
1.2 Getting started with MatLab
1.3 Getting organized
1.4 Navigating folders
1.5 Simple arithmetic and algebra
1.6 Vectors and matrices
1.7 Multiplication of vectors and matrices
1.8 Element access
1.9 Representing functions
1.10 To loop or not to loop
1.11 The matrix inverse
1.12 Loading data from a file
1.13 Plotting data
1.14 Saving data to a file
1.15 Some advice on writing scripts
Problems
2: A first look at data
Abstract
2.1 Look at your data!
2.2 More on MatLab graphics
2.3 Rate information
2.4 Scatter plots and their limitations
Problems
3: Probability and what it has to do with data analysis
Abstract
3.1 Random variables
3.2 Mean, median, and mode
3.3 Variance
3.4 Two important probability density functions
3.5 Functions of a random variable
3.6 Joint probabilities
3.7 Bayesian inference
3.8 Joint probability density functions
3.9 Covariance
3.10 Multivariate distributions
3.11 The multivariate Normal distributions
3.12 Linear functions of multivariate data
Problems
4: The power of linear models
Abstract
4.1 Quantitative models, data, and model parameters
4.2 The simplest of quantitative models
4.3 Curve fitting
4.4 Mixtures
4.5 Weighted averages
4.6 Examining error
4.7 Least squares
4.8 Examples
4.9 Covariance and the behavior of error
Problems
5: Quantifying preconceptions
Abstract
5.1 When least squares fails
5.2 Prior information
5.3 Bayesian inference
5.4 The product of Normal probability density functions
5.5 Generalized least squares
5.6 The role of the covariance of the data
5.7 Smoothness as prior information
5.8 Sparse matrices
5.9 Reorganizing grids of model parameters
Problems
6: Detecting periodicities
Abstract
6.1 Describing sinusoidal oscillations
6.2 Models composed only of sinusoidal functions
6.3 Going complex
6.4 Lessons learned from the integral transform
6.5 Normal curve
6.6 Spikes
6.7 Area under a function
6.8 Time-delayed function
6.9 Derivative of a function
6.10 Integral of a function
6.11 Convolution
6.12 Nontransient signals
Problems
7: The past influences the present
Abstract
7.1 Behavior sensitive to past conditions
7.2 Filtering as convolution
7.3 Solving problems with filters
7.4 An example of an empirically derived filter
7.5 Predicting the future
7.6 A parallel between filters and polynomials
7.7 Filter cascades and inverse filters
7.8 Making use of what you know
Problems
8: Patterns suggested by data
Abstract
8.1 Samples as mixtures
8.2 Determining the minimum number of factors
8.3 Application to the Atlantic Rocks dataset
8.4 Spiky factors
8.5 Weighting of elements
8.6 Q-mode factor analysis and spatial clustering
8.7 Time-variable functions
Problems
9: Detecting correlations among data
Abstract
9.1 Correlation is covariance
9.2 Computing autocorrelation by hand
9.3 Relationship to convolution and power spectral density
9.4 Cross-correlation
9.5 Using the cross-correlation to align time series
9.6 Least squares estimation of filters
9.7 The effect of smoothing on time series
9.8 Band-pass filters
9.9 Frequency-dependent coherence
9.10 Windowing before computing Fourier transforms
9.11 Optimal window functions
Problems
10: Filling in missing data
Abstract
10.1 Interpolation requires prior information
10.2 Linear interpolation
10.3 Cubic interpolation
10.4 Kriging
10.5 Interpolation in two dimensions
10.6 Fourier transforms in two dimensions
Problems
11: “Approximate” is not a pejorative word
Abstract
11.1 The value of approximation
11.2 Polynomial approximations and Taylor series
11.3 Small number approximations
11.4 Small number approximation applied to distance on a sphere
11.5 Small number approximation applied to variance
11.6 Taylor series in multiple dimensions
11.7 Small number approximation applied to covariance
11.8 Solving nonlinear problems with iterative least squares
11.9 Fitting a sinusoid of unknown frequency
11.10 The gradient method
11.11 Precomputation of a function and table lookups
11.12 Artificial neural networks
11.13 Information flow in a neural net
11.14 Training a neural net
11.15 Neural net for a nonlinear response
Problems
12: Are my results significant?
Abstract
12.1 The difference is due to random variation!
12.2 The distribution of the total error
12.3 Four important probability density functions
12.4 A hypothesis testing scenario
12.5 Chi-squared test for generalized least squares
12.6 Testing improvement in fit
12.7 Testing the significance of a spectral peak
12.8 Bootstrap confidence intervals
Problems
13: Notes
Abstract
Note 1.1 On the persistence of MatLab variables
Note 2.1 On time
Note 2.2 On reading complicated text files
Note 3.1 On the rule for error propagation
Note 3.2 On the eda_draw() function
Note 4.1 On complex least squares
Note 5.1 On the derivation of generalized least squares
Note 5.2 On MatLab functions
Note 5.3 On reorganizing matrices
Note 6.1 On the MatLab atan2() function
Note 6.2 On the orthonormality of the discrete Fourier data kernel
Note 6.3 On the expansion of a function in an orthonormal basis
Note 8.1 On singular value decomposition
Note 9.1 On coherence
Note 9.2 On Lagrange multipliers
Note 11.1 On the chain rule for partial derivatives
Index