Formal Methods II

CSE3305




Search Algorithms


Could be in the exam:
Suppose f(a) takes O(a) steps as a function of the value a. The input itself, though, is only log2(a) bits long.


You have to think of the input SIZE not the input VALUE.

So in terms of the input size s = log2(a), the proper complexity is O(2^s): exponential.

e.g.
g(x) = { O(x)    if x is prime
       { O(x^2)  if x is not prime
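A minimal sketch (my own, not from the lecture) of the value-vs-size point: a naive primality test loops up to the value a, so it is O(a) in the value, but since the input is only s = log2(a) bits, that is O(2^s) in the input size.

```python
def trial_division_is_prime(a):
    """Naive primality test: O(a) steps in the *value* a."""
    if a < 2:
        return False
    for d in range(2, a):     # up to a - 2 iterations
        if a % d == 0:
            return False
    return True

bits = 10                     # input size in bits
a = 2 ** bits                 # the value grows exponentially with the size
steps = a - 2                 # loop iterations: O(a) = O(2^bits)
```

Doubling the bit-length of the input squares the value, and hence squares the running time: exponential in the size.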

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

Local Optima

When you throw those techniques at a problem, you can get stuck at a local optimum! So you can use generic local search techniques to try to get yourself out of this.

Greedy Search
You have a partial solution, and you build up a complete solution from it, always taking the locally best step.
Local search is greedy; so is Dijkstra's algorithm, etc.
Systematic or exhaustive search is not greedy, and it can escape local optima.

Korb's version of MF's Chapter 5

Iterated Hill Climber
Stochastic hill climber
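The stochastic hill climber can be sketched as below (a minimal version of my own; the 1-D toy objective and names are mine, and the acceptance rule 1/(1 + e^(-Δ/T)) follows the form described in Michalewicz & Fogel's Chapter 5):

```python
import math
import random

def stochastic_hill_climb(f, x0, neighbours, steps=1000, temperature=1.0):
    """Stochastic hill climber: pick a random neighbour, move to it with
    probability 1 / (1 + e^(-delta/T)), where delta = f(new) - f(current).
    Uphill moves are near-certain; downhill moves are possible but unlikely,
    which lets the climber escape shallow local optima."""
    x = x0
    for _ in range(steps):
        y = random.choice(neighbours(x))
        delta = f(y) - f(x)
        if random.random() < 1.0 / (1.0 + math.exp(-delta / temperature)):
            x = y
    return x

# Toy example: maximise f(x) = -(x - 7)^2 over the integers.
f = lambda x: -(x - 7) ** 2
neighbours = lambda x: [x - 1, x + 1]
best = stochastic_hill_climb(f, x0=0, neighbours=neighbours)
```

With the temperature fixed, this differs from simulated annealing only in that T never decreases.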


Simulations


Why simulate?
We're being shown techniques for approximating answers when exact solutions are out of reach.

The fraction of problems that can be solved exactly with a sane amount of effort is quite small. -N. Gershenfeld.

Answer: Why work hard when a slave (computer simulation) can do it for you?

We've already given up on exactness!

So what can we do?

1. Analytic integration.
2. Numerical integration. Computationally intensive in multiple dimensions.
3. Analog computation. It's not that bad if you can draw accurately! You can get an approximation of the answer.

Monte Carlo Integration
It's particularly efficient and accurate, especially for high dimensional functions, as long as the function is reasonably well behaved.

Integral can be estimated by randomly selecting n points and taking the average.

So all slide 10 is saying is that you are taking the average:

    integral from a to b of f(x) dx  ≈  (b - a) · (1/n) · Σ f(x_i)

[Sketch on the slide: the area under f between a and b.]

Monte Carlo Integration directly applies to multidimensional f.
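The averaging idea above can be sketched in a few lines (my own minimal version; the function name mc_integrate is hypothetical):

```python
import random

def mc_integrate(f, a, b, n=100_000):
    """Estimate the integral of f over [a, b] by averaging f at n
    uniformly random points, then scaling by the interval width."""
    total = sum(f(random.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

# Example: integral of x^2 over [0, 1] is exactly 1/3.
est = mc_integrate(lambda x: x * x, 0.0, 1.0)
```

The same function works unchanged in higher dimensions if the points are sampled from the multidimensional region, which is exactly why Monte Carlo pays off there.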

Slide 13
Buffon was tossing needles and used this to come up with an estimation of π.

It turns out they needed only really small sample sizes. But then they weren't really estimating π at all: they already knew the answer!

So the method doesn't prove much if you know the answer already. That's what he means by: if you just stare at these numbers, you never really know when you're done.

You should do repeated runs and estimate the variance. If it's 0.0025, you'll have 3-digit accuracy, and you're done.
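Buffon's needle experiment can be simulated directly (a minimal sketch of my own; note the usual pedagogical cheat of using math.pi to draw the random angle):

```python
import math
import random

def buffon_pi(n=200_000, needle=1.0, gap=1.0):
    """Estimate pi by dropping n needles onto lines `gap` apart (needle <= gap).
    A needle crosses a line when the distance from its centre to the nearest
    line is at most (needle/2)·sin(angle); P(cross) = 2·needle / (pi·gap)."""
    hits = 0
    for _ in range(n):
        x = random.uniform(0.0, gap / 2)           # centre to nearest line
        theta = random.uniform(0.0, math.pi / 2)   # needle angle
        if x <= (needle / 2) * math.sin(theta):
            hits += 1
    return 2 * needle * n / (gap * hits)

pi_est = buffon_pi()
```

Running this repeatedly and looking at the spread of pi_est is exactly the "estimate the variance" advice above.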

Slide 16
What do you do about controlling your variance?
Increase the sample size! If you do this enough, you'll eventually get convergence. But with complex problems, you could be waiting billions of years for this.

The point is just that variance is controlled by sample size: the variance of the estimate decreases in inverse proportion to the sample size (the variance of the mean is σ²/n).
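The 1/n behaviour can be checked empirically (a quick sketch of my own, estimating the mean of a Uniform(0, 1) variable):

```python
import random
import statistics

def mean_estimate(n):
    """Monte Carlo estimate of E[U] = 0.5 for U ~ Uniform(0, 1)."""
    return sum(random.random() for _ in range(n)) / n

def estimator_variance(n, repeats=2000):
    """Empirical variance of the estimator across repeated runs."""
    return statistics.pvariance([mean_estimate(n) for _ in range(repeats)])

v1 = estimator_variance(100)    # theory: (1/12) / 100
v2 = estimator_variance(400)    # theory: (1/12) / 400, a quarter of v1
```

Quadrupling the sample size should cut the estimator's variance to roughly a quarter, i.e. the standard error halves.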

Pseudo-random numbers are simply too random! Two-dimensional random noise still shows clumps and voids, like the radiation and gravitation of super-galaxies on a TV screen, for example.
Quasi-random (low-discrepancy) numbers are spread more evenly, so the estimate converges quicker.
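One standard quasi-random construction is the van der Corput sequence (a sketch of my own, base 2):

```python
def van_der_corput(i, base=2):
    """i-th point of the base-b van der Corput low-discrepancy sequence:
    reflect the base-b digits of i about the radix point, so successive
    points keep subdividing [0, 1) evenly instead of clumping."""
    x, denom = 0.0, 1.0
    while i > 0:
        denom *= base
        i, digit = divmod(i, base)
        x += digit / denom
    return x

# First points: 1/2, 1/4, 3/4, 1/8, 5/8, 3/8, 7/8, 1/16, ...
points = [van_der_corput(i) for i in range(1, 9)]
```

Each new point lands in the largest remaining gap, which is exactly the even coverage that speeds Monte Carlo convergence.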

We are just being pointed to these techniques.

Slide 18
Random walks have been used to model even such things as the stock market (a view associated with the neo-conservatives).

Some other applications of random walks
The behaviour of liquids and gases in physics.
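The basic model behind all of these applications is tiny: at each step, move up or down by one with equal probability (a minimal sketch of my own):

```python
import random

def random_walk(steps, p_up=0.5):
    """Symmetric 1-D random walk: from position 0, move +1 with
    probability p_up, else -1, at each step. A toy model of stock
    prices, diffusing molecules, and the like."""
    position, path = 0, [0]
    for _ in range(steps):
        position += 1 if random.random() < p_up else -1
        path.append(position)
    return path

path = random_walk(1000)
```

The typical distance from the origin after n steps grows like sqrt(n), which is why diffusion is slow.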


Lecture


Venn's identity "probability = frequency" clearly fails in small samples: a single coin flip gives a frequency of 0 (or 1) just because the coin was only flipped once.
von Mises moved to probabilities as limiting frequencies in large samples.

Definition of a collective:
An infinite sequence of outcomes S = (s1, ..., sn, ...) of a repeatable experiment (where each si comes from the set of all its possible outcomes), satisfying the conditions:
1. Axiom of Convergence
2. Axiom of Randomness

Let Ω be the set of all possible outcomes of a repeatable experiment. Let the sequence S = (s1, s2, ...) be composed of outcomes in that set (each si ∈ Ω). Assuming S satisfies von Mises' definition of a collective, show that the corresponding probability function P satisfies Kolmogorov's axioms for a probability function. That is, taking P as defined over all the singleton sets of outcomes and their unions, show that it satisfies those axioms.
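The claim in the exercise can be checked on a finite sample (a sketch of my own, with limiting relative frequencies replaced by finite ones; the name freq_prob is hypothetical):

```python
from collections import Counter
from fractions import Fraction

def freq_prob(sequence):
    """Relative-frequency probability on a finite outcome sequence:
    P({w}) = (# occurrences of w) / n, extended to sets of outcomes
    by summing over their singletons."""
    n = len(sequence)
    counts = Counter(sequence)
    return lambda A: sum(Fraction(counts[w], n) for w in A)

seq = ['H', 'T', 'H', 'H', 'T', 'H']   # a finite stand-in for a collective
P = freq_prob(seq)
omega = {'H', 'T'}
```

Non-negativity, P(Ω) = 1, and additivity over disjoint sets all follow mechanically from counting; the exercise is to show they survive the limit n → ∞.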

