Tuesday, December 2, 2014

Week 12: Countability, Induction, Assignment 3

During this period I had been working on Assignment 3; on the past two assignments I had not received the grades I had hoped for, so I worked extra hard on the last one in hopes of boosting my grade a little.

This week's lecture covered countability, induction, and review for the final.

First off, a set is countable when there is a mapping between it and (a subset of) the natural numbers that is well-defined, one-to-one, and onto. We did an example in class to see how many Python programs there are and whether they are countable. We used "diagonalization" on a table of functions and their halting behaviour. Given a list of countably many functions, we can always construct a function outside the list, so the set of functions is uncountable; since the set of Python programs is countable, there are functions we cannot program in Python.
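A small sketch of the diagonalization idea (my own illustration, not code from lecture): given any listing of functions from naturals to naturals, we can build a function that disagrees with every entry on the diagonal.

    # Diagonalization sketch (my illustration, not from lecture).
    def diagonal(fs):
        """Given an indexable listing fs of functions, return a function
        that differs from fs[n] at input n, so it is not in the listing."""
        def g(n):
            return fs[n](n) + 1  # disagree with the n-th function at n
        return g

    fs = [lambda n: 0, lambda n: n, lambda n: n * n]  # finite stand-in listing
    g = diagonal(fs)
    for n in range(3):
        assert g(n) != fs[n](n)  # g escapes every listed function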

Tuesday, November 25, 2014

Week 10: Further Big-Oh/Omega Proofs, Intro to Computability

We started off this lecture with a big-Omega proof. It is similar to a big-Oh proof, except reversed: we underestimate the function and overestimate the bound inside the bracket, since we now need to show the function is at least c times the bound.

Takeaways:
- use the "magic" breakpoint B = 1
- simplify the function without changing the highest-degree term
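As a worked example of this pattern (my own, not necessarily the one from lecture), to show 7n^6 - 5n^4 + 2n^3 is in Omega(n^6):

    7n^6 - 5n^4 + 2n^3 >= 7n^6 - 5n^4    (dropping the positive term underestimates)
                       >= 7n^6 - 5n^6    (for n >= 1, n^4 <= n^6)
                        = 2n^6

so c = 2 and B = 1 witness the Omega(n^6) claim.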

Next, we learned to prove general statements about big-Oh. Using a definition of function sets, we can group functions together; for example, all the functions we work with take in a natural number and return a non-negative real number. For these problems, the main point is to find a relationship between the c and B values of one big-Oh claim and those of another. Such a comparative relationship can then be manipulated to prove the statement from the definition.
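A standard statement of this kind (my illustration, possibly not the exact one from lecture) is that big-Oh is transitive:

    Claim: if f is in O(g) and g is in O(h), then f is in O(h).
    Proof sketch: pick c1, B1 with f(n) <= c1 * g(n) for all n >= B1,
    and c2, B2 with g(n) <= c2 * h(n) for all n >= B2.
    Then for all n >= max(B1, B2):
        f(n) <= c1 * g(n) <= c1 * (c2 * h(n)) = (c1 * c2) * h(n),
    so c = c1 * c2 and B = max(B1, B2) witness f in O(h).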



Computability:

The next part of our lecture was on computability. Computers cannot solve everything: some functions are perfectly well-defined yet impossible to compute, and any attempted implementation will loop forever or give wrong answers on some inputs.
However, if we can reduce one function to another, then we only need to settle the computability of one of them: if f can be computed using g as a helper, then g being computable implies f is computable, and f being non-computable implies g is non-computable.
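The classic non-computable function is the halting function. A sketch of the standard contradiction (my reconstruction; the names are mine):

    # Halting-problem contradiction (my reconstruction of the standard argument).
    def halt(f, i):
        """Hypothetical oracle: True iff f(i) eventually halts.
        No such computable function can actually exist."""
        raise NotImplementedError

    def contrary(f):
        while halt(f, f):   # if f(f) would halt, loop forever
            pass
        return 42           # if f(f) would loop forever, halt

    # contrary(contrary) halts if and only if it does not halt:
    # a contradiction, so halt() cannot be computable.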

Wednesday, November 5, 2014

Week 9: Big-Oh Proofs

In this week of lecture we learned how to prove using the definition of Big-Oh.

Using the definition, the basic concept to grasp is finding a c and B such that past the threshold B, the inequality in the definition holds. One trick: evaluate the function at n = 1 and use that value as c. For example, to prove 3n^2 + 2n + 5 is in O(n^2), let n = 1, so the function evaluates to 10. We let c = 10 and B = 1; for all n >= 1, 10n^2 is greater than or equal to the function, so the definition holds.
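A quick numeric spot-check of those choices (my own; evidence over a finite range, not a proof):

    # Spot-check c = 10, B = 1 for 3n^2 + 2n + 5 in O(n^2).
    f = lambda n: 3 * n**2 + 2 * n + 5
    assert all(f(n) <= 10 * n**2 for n in range(1, 10001))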

For more complex functions and bounds, we meet the two in the centre: we overestimate the function and underestimate the bound. For example, 7n^6 - 5n^4 + 2n^3 <= 7n^6 + 2n^3 <= 7n^6 + 2n^6 = 9n^6 for n >= 1, so c = 9 and B = 1 work for O(n^6). Vice versa, we underestimate for a big-Omega proof.

In order to disprove, we just prove the negation. This involves picking a smart n that works for whatever c and B values we are handed.

For non-polynomials, we prove using limits. To show a function f is not in big-Oh of a function g: given any c, we find a breakpoint n' so that for all n > n', f(n)/g(n) > c. That is, beyond a threshold, f is always more than c times g, no matter which c was chosen.
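For instance (my own example, maybe not the lecture's), the ratio 2^n / n^2 grows past any constant, which is the intuition behind 2^n not being in O(n^2):

    # The ratio 2^n / n^2 eventually exceeds any constant c (my example).
    for n in (10, 20, 30, 40):
        print(n, 2**n / n**2)   # 10.24, 2621.44, ~1.19e6, ~6.87e8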

Thursday, October 30, 2014

Week 8: Definition of O and Ω, and Sorting Algorithms

This lecture we started with O and Ω.

O(n) is the set of functions that grow no faster than n; for example, a function in O(n^2) grows no faster than n^2. The more formal definition for O(n^2) is that beyond a breakpoint B, f(n) is upper-bounded by cn^2, where c is a constant multiplier. The same pattern defines O(n^3), etc. We can prove a function is in a certain O by using the formal definition and wisely choosing a B and c.

At the other end of the spectrum is Ω(n). This set contains all the functions that grow no slower than n. For example, Ω(n^2) contains all the functions that grow no slower than n^2: n^2, n^3, etc.

When a function is both in Ω(n) and in O(n), then that function is in Θ(n), meaning it grows at the same rate as n.

Growth rate rankings: 1, log n, √n, n, n log n, n^2, n^3, ..., 2^n, n^n
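A little table (my own) makes the ranking concrete:

    # Rough values at n = 100, ordered by growth rate (my own illustration).
    import math
    n = 100
    for name, val in [("1", 1), ("log n", math.log2(n)), ("sqrt n", math.sqrt(n)),
                      ("n", n), ("n log n", n * math.log2(n)),
                      ("n^2", n**2), ("n^3", n**3),
                      ("2^n", 2**n), ("n^n", n**n)]:
        print(f"{name:8} {val:.3g}")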



For the algorithm analysis part of the lecture, we learned to calculate the steps required for insertion sort and max slice. 

Insertion sort: we take the worst-case scenario to count the steps required. For example, the loop inside the loop requires 3i + 5 steps on iteration i. The outer loop requires n - 1 iterations, so we end up with the sum from i = 1 to n - 1 of 3i + 5. We simplify it (to 3(n-1)n/2 + 5(n-1)), and then we can compare it with O(n^2) or Ω(n^2) using the proper definition and finding B and c.
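Here is how that counting looks in code (my sketch; exactly which operations count as "steps" is a convention, so the constants may differ from the lecture's 3i + 5):

    # Insertion sort with a step counter (my sketch).
    def insertion_sort_steps(A):
        steps = 0
        for i in range(1, len(A)):
            key, j = A[i], i - 1
            steps += 2                    # the two assignments above
            while j >= 0 and A[j] > key:  # worst case: runs i times
                A[j + 1] = A[j]
                j -= 1
                steps += 3                # test + shift + decrement
            A[j + 1] = key
            steps += 2                    # failed test + final insertion
        return steps

    # Worst case (reverse-sorted input) grows roughly quadratically:
    print([insertion_sort_steps(list(range(n, 0, -1))) for n in (10, 20, 40)])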

In max slice, we only calculated the upper and lower bounds. For the upper bound, we can overestimate the number of steps; for the lower bound, we can underestimate them. The small margin of error introduced by estimating does not affect the asymptotic answer.
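Reconstructing max slice as the maximum-sum contiguous slice problem (my assumption about which problem this was), the nested loops below are what the upper and lower bounds estimate:

    # Brute-force max slice (my reconstruction).
    def max_sum_slice(L):
        best = 0
        for i in range(len(L)):           # choose a start index
            for j in range(i, len(L)):    # choose an end index
                best = max(best, sum(L[i:j + 1]))
        return best

    print(max_sum_slice([3, -4, 2, 1, -1, 5]))  # slice [2, 1, -1, 5] sums to 7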

 

Thursday, October 23, 2014

Week 7: End of Proofs + Algorithms

The first thing we learned this lecture was proof by cases: covering all possibilities of a statement by splitting it into several cases. For example, if a question involves proving something for a natural number, we can split it into two cases: one where the number is odd, another where it is even.
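A standard worked example (mine, not necessarily the lecture's): n^2 + n is even for every natural number n.

    Case 1: n is even, so n = 2k. Then n^2 + n = 4k^2 + 2k = 2(2k^2 + k), which is even.
    Case 2: n is odd, so n = 2k + 1. Then n^2 + n = (4k^2 + 4k + 1) + (2k + 1)
            = 4k^2 + 6k + 2 = 2(2k^2 + 3k + 1), which is even.
    The two cases cover all natural numbers, so the statement holds.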

We reviewed the different proof patterns: given what's known, what can we infer? There are two kinds. Introduction: from smaller statements, we can infer a larger statement. Elimination is the opposite of introduction: from a larger statement, we can extract a smaller one.


The next big topic we learned about was algorithm analysis. Here we compared bubble sort and merge sort and, based on actual examples, determined which was faster and why. To compare algorithms, we should compare the growth factor; that is, if we double the amount of data to be sorted, by how many times does the runtime increase? Running time is the number of steps taken by the algorithm, but we only care about how it grows.
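A doubling experiment along those lines (my own sketch): counting comparisons in bubble sort, a quadratic algorithm, the count should roughly quadruple each time the input doubles.

    # Doubling experiment on bubble sort (my sketch): ~4x steps per doubling.
    import random

    def bubble_comparisons(L):
        L, count = L[:], 0
        for i in range(len(L) - 1, 0, -1):
            for j in range(i):
                count += 1
                if L[j] > L[j + 1]:
                    L[j], L[j + 1] = L[j + 1], L[j]
        return count

    for n in (250, 500, 1000):
        print(n, bubble_comparisons([random.random() for _ in range(n)]))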

Since n^2 dominates, n^2 + n is seen as the same thing as n^2. We only care about the highest-order term, because as the data set increases, n^2 grows much faster than the lower-order terms.

Wednesday, October 15, 2014

Week 6: Further Proofs

Continuing our discussion of proofs, we learned some more structures for proofs this lecture: more complex ones, involving more thinking.

First of all, we focused on non-boolean functions. Quantifiers cannot be applied directly to non-boolean expressions, so in order to prove a statement about such a function we need to be creative. Using the definitions of the functions, we can derive inequalities and comparisons that are useful when proving a statement. The takeaway is to manipulate expressions into forms closer to those in the definition.
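A typical instance (my example, not necessarily from lecture) is the floor function: unpacking its definition hands us ready-made inequalities.

    Definition: floor(x) is the largest integer m with m <= x.
    Immediate consequences: floor(x) <= x, and x < floor(x) + 1
    (otherwise floor(x) + 1 would be a larger integer still <= x,
    contradicting maximality). These inequalities can then be
    chained into a larger proof.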

Next, we learned to disprove a statement by finding one counter-example, similar to what we have done before. We negate the statement, then work through examples to find a counter-example.
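A tiny illustration (my own toy claim, not from lecture): to disprove "for every natural n, 2^n >= n^2", one failing n is enough.

    # Search for a counter-example to "2^n >= n^2 for all naturals n".
    n = next(n for n in range(100) if 2**n < n**2)
    print(n, 2**n, n**2)   # n = 3: 8 < 9, so the universal claim is false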

Lastly, we tackled limits. This is a tricky one, as it involves thinking backwards to find a proof. The example in class was tough to understand, but once I read it over again the method for finding a correct delta made sense; the hard part is finding the starting point.
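A small worked example of that backwards thinking (mine, not the one from class): proving lim_{x -> 3} 2x = 6.

    Goal: for every epsilon > 0, find delta > 0 such that
          0 < |x - 3| < delta  implies  |2x - 6| < epsilon.
    Working backwards: |2x - 6| = 2|x - 3|, which is < epsilon
    whenever |x - 3| < epsilon / 2. So choose delta = epsilon / 2,
    then write the proof forwards.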

Monday, October 13, 2014

Week 5: Further Proofs

This week's lecture included our midterm. After the midterm, we learned some additional methods of proving.

Last week, we learned how to prove a universally quantified implication; it was simple and straightforward. This week, the first thing we learned was using the contrapositive to prove an implication. For example, to prove P implies Q, we can prove not Q implies not P; it achieves the same result. We use this when the contrapositive is easier to prove than the original.
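A standard example (my own, possibly not the lecture's): proving "if n^2 is even, then n is even" directly is awkward, but the contrapositive is easy.

    Contrapositive: if n is odd, then n^2 is odd.
    Proof: n odd means n = 2k + 1, so n^2 = 4k^2 + 4k + 1 = 2(2k^2 + 2k) + 1,
    which is odd. The contrapositive holds, so the original implication does too.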

Next up, we learned the usage of contradiction, which is closely related to contrapositive. This is a tricky one, and it involves assuming not Q instead of Q. Using this method, to prove P implies Q we assume P and not Q, then look for a contradiction, which shows the assumption not Q was impossible.
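The classic worked example of contradiction (standard, though perhaps not the one from lecture): there is no largest natural number.

    Claim: there is no largest natural number.
    Proof: assume, for contradiction, that some natural m is the largest.
    Then m + 1 is also a natural number and m + 1 > m, contradicting
    the assumption that m is largest. So no largest natural exists.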

For proof of an existential, we just need to find one example that satisfies the statement; for instance, to prove "some natural number equals its own square", exhibit n = 1.

The overall summary of the lecture is that one proof may consist of many different structures, and that with multiple quantifiers, we should zoom into smaller parts of the statement.