Note:
Usually, this homework would mostly involve problems from last week's
lecture (strong induction, structural induction),
plus a few reading problems.
This week has no (non-extra-credit) problems from last week's lecture.
The reason is that although it's an important concept,
the other 122 section isn't covering that material,
and I want to keep the homeworks roughly equivalent between sections.
...of course). if-then statements are fine.
For the Tree we saw in class, write:
boolean contains(String target).
boolean countOccurrences(String target).
Write append for SList.
For EmptyTree and for Branch, write down the recursive call, and then
think about how to use that result to construct your overall result.
(You might find it helpful to write out a test case,
and the result of the recursive call on the test case.)
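For instance, here is that pattern on a method that is not part of the
assignment, an int size() that counts nodes. This is only a sketch: the
field names (value, left, right) and the constructor are guesses at what
the Tree classes from lecture look like.

    // Sketch of the recursive pattern; the class shape is an assumption.
    abstract class Tree {
        abstract int size();    // example method: count the Branch nodes
    }

    class EmptyTree extends Tree {
        int size() {
            return 0;           // base case: an empty tree has no nodes
        }
    }

    class Branch extends Tree {
        String value;
        Tree left, right;

        Branch(String value, Tree left, Tree right) {
            this.value = value;
            this.left = left;
            this.right = right;
        }

        int size() {
            // First write down the recursive calls...
            int leftSize = left.size();
            int rightSize = right.size();
            // ...then ask how those results combine into the overall answer.
            return 1 + leftSize + rightSize;
        }
    }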
Extra Credit, continuing from previous:
Use structural induction to prove that for any SLists a, b, c,
(a.append(b)).append(c) equals a.append( b.append(c) ).
That is, show that a++(b++c) = (a++b)++c, where ++ stands for append;
in other words, append is associative.
As in the structural induction from lecture,
you'll start with a statement P(sl).
Then you'll have two main branches:
showing that P(el) holds, where el is any EmptyList,
and showing that P(c) holds, where c is any Cons;
your inductive hypothesis is P(c.rest).
(As in the notes and Rosen,
write out what this statement is,
before showing that P(c.rest) → P(c).)
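If it helps, here is the same skeleton rendered in Lean (a sketch in a
different notation, not the format your write-up should take; the
definitions of SList and append below are the usual ones and are
assumptions about the classes from lecture):

    -- Assumed definition: an SList is either an EmptyList or a Cons.
    inductive SList
      | EmptyList : SList
      | Cons : String → SList → SList

    -- The usual definition of append (an assumption).
    def append : SList → SList → SList
      | .EmptyList, b => b
      | .Cons x rest, b => .Cons x (append rest b)

    -- P(sl): for all b and c, (sl ++ b) ++ c = sl ++ (b ++ c).
    theorem append_assoc (a b c : SList) :
        append (append a b) c = append a (append b c) := by
      induction a with
      | EmptyList => simp [append]           -- base case: P(el)
      | Cons x rest ih => simp [append, ih]  -- inductive step uses P(c.rest)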
...of course). if-then statements are fine.
...or Σ.)
(You may need an if or case statement, to handle the base case(s) correctly.)
First, look at some sites like tinyurl.com or snurl.com, which take long URLs and create new short names for them. You might wonder how quickly such sites will run out of short names.
If there are 10^10 static web pages, and every single one of them were registered on such a site, how long would the longest short URL be? (Just count the encoded part of the name, without the initial "http://www.snurl.com/" prefix.)
We might then start to wonder: is 10^10 a reasonable upper estimate on the number of web pages? After all, one of the best uses of these sites is abbreviating dynamic URLs which contain (long) queries embedded inside them (e.g. mapquest URLs). So we'll ask the same question, but now suppose that 10^10 people each generate one dynamic request per second, for 10^10 sec (about 325 years; e.g. since William of Orange took the English throne in 1688). How long will the longest short URLs be now?
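For a sense of the arithmetic involved (a sketch; the 62-symbol alphabet of upper- and lower-case letters plus digits is an assumption about what these sites actually use): k characters give 62^k distinct names, so you want the smallest k with 62^k at least the number of names needed. In LaTeX notation:

    % Smallest name length k covering N names, assuming 62 symbols:
    62^k \ge N
      \iff k \ge \log_{62} N
         = \frac{\log_{10} N}{\log_{10} 62}
         \approx \frac{\log_{10} N}{1.79}
    % e.g. N = 10^6 gives k \ge 6/1.79 \approx 3.4, so 4 characters.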
To think about for yourself:
Does one of these sites do a better job than the other?
Which do you prefer?