
Commit

Merge branch 'master' into dagoba-typeset
Conflicts:
	build.py
	tex/500L.tex
MichaelDiBernardo committed Dec 22, 2015
2 parents fbac0dc + bff4784 commit 5fa3e97
Showing 22 changed files with 3,126 additions and 75 deletions.
2 changes: 2 additions & 0 deletions .gitignore
@@ -44,6 +44,8 @@ tex/modeller-images
tex/modeller.markdown
tex/objmodel-images
tex/objmodel.markdown
+tex/ocr-images
+tex/ocr.markdown
tex/pedometer-images
tex/pedometer.markdown
tex/sample-images
48 changes: 25 additions & 23 deletions blockcode/blockcode.markdown

Large diffs are not rendered by default.

4 changes: 3 additions & 1 deletion build.py
@@ -15,6 +15,7 @@ def main(chapters=[], epub=False, pdf=False, html=False, mobi=False, pandoc_epub

chapter_dirs = [
'dagoba',
+'ocr',
'contingent',
'same-origin-policy',
'blockcode',
@@ -68,6 +69,7 @@ def main(chapters=[], epub=False, pdf=False, html=False, mobi=False, pandoc_epub
]

image_paths = [
+'./ocr/ocr-images',
'./contingent/contingent-images',
'./same-origin-policy/same-origin-policy-images',
'./blockcode/blockcode-images',
@@ -193,7 +195,7 @@ def build_mobi():
def build_html(chapter_markdowns):
run('mkdir -p html/content/pages')
temp = 'python _build/preprocessor.py --chapter {chap} --html-refs --html-paths --output={md}.1 --latex {md}'
-temp2 = 'pandoc --csl=minutiae/ieee.csl --bibliography=tex/500L.bib -t html -f markdown+citations -o html/content/pages/{basename}.md {md}.1'
+temp2 = 'pandoc --csl=minutiae/ieee.csl --mathjax --bibliography=tex/500L.bib -t html -f markdown+citations -o html/content/pages/{basename}.md {md}.1'
temp3 = './_build/fix_html_title.sh html/content/pages/{basename}.md'
for i, markdown in enumerate(chapter_markdowns):
basename = os.path.splitext(os.path.split(markdown)[1])[0]
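The build.py change adds `--mathjax` to the second pandoc invocation so that equations in chapter markdown render in the HTML output. A minimal sketch of how such a command template might be filled in for one chapter (the chapter path here is hypothetical, not taken from the repository):

```python
# Command template in the style of build.py's temp2, including the new
# --mathjax flag for rendering LaTeX math in the generated HTML.
TEMPLATE = ('pandoc --csl=minutiae/ieee.csl --mathjax '
            '--bibliography=tex/500L.bib -t html -f markdown+citations '
            '-o html/content/pages/{basename}.md {md}.1')

def render_command(markdown_path):
    """Fill in the template for one chapter markdown file."""
    basename = markdown_path.rsplit('/', 1)[-1].rsplit('.', 1)[0]
    return TEMPLATE.format(basename=basename, md=markdown_path)

cmd = render_command('tex/ocr.markdown')  # hypothetical chapter path
print(cmd)
```

In build.py itself the filled-in string is handed to a `run` helper; this sketch only shows the template expansion.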
4 changes: 2 additions & 2 deletions ci/README.rst
@@ -65,7 +65,7 @@ Copy the tests/ folder from this code base to test_repo and commit it::
cp -r /this/directory/tests /path/to/test_repo/
cd /path/to/test_repo
git add tests/
-git commit -madd tests
+git commit -m "add tests"

The repo observer will need its own clone of the code::

@@ -110,7 +110,7 @@ to make a new commit. Go to your master repo and make an arbitrary change::
cd /path/to/test_repo
touch new_file
git add new_file
-git commit -m"new file" new_file
+git commit -m "new file" new_file

then repo_observer.py will realize that there's a new commit and will notify
the dispatcher. You can see the output in their respective shells, so you
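The README fix above separates the `-m` flag from its message and quotes it. The difference is easiest to see in how the shell tokenizes the two forms; a quick illustration using Python's `shlex` (for demonstration only — the commands are not run):

```python
import shlex

# How a POSIX shell would split each command line into arguments.
broken = shlex.split('git commit -madd tests')
fixed = shlex.split('git commit -m "add tests"')

# In the broken form, "add" is glued onto -m as the message and "tests"
# becomes a separate token that git would treat as a pathspec.
print(broken)  # ['git', 'commit', '-madd', 'tests']
print(fixed)   # ['git', 'commit', '-m', 'add tests']
```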
4 changes: 2 additions & 2 deletions ci/ci.markdown
@@ -179,7 +179,7 @@ Copy the tests folder from this code base to `test_repo` and commit it:
$ cp -r /this/directory/tests /path/to/test_repo/
$ cd /path/to/test\_repo
$ git add tests/
-$ git commit -m”add tests”
+$ git commit -m ”add tests”
```

Now you have a commit in the master repository.
@@ -227,7 +227,7 @@ modified this assumption for simplicity.

The observer must know which repository to observe. We previously
created a clone of our repository at `/path/to/test_repo_clone_obs`.
-The repository will use this clone to detect changes. To allow the
+The observer will use this clone to detect changes. To allow the
repository observer to use this clone, we pass it the path when we
invoke the `repo_observer.py` file. The repository observer will use this
clone to pull from the main repository.
32 changes: 12 additions & 20 deletions contingent/contingent.markdown
@@ -19,8 +19,7 @@ things; he loves seeing the spark of wonder and delight in people's eyes when
someone shares a novel, surprising, or beautiful idea. Daniel lives in Atlanta
with a microbiologist and four aspiring rocketeers._

-Introduction
-============
+## Introduction

Build systems have long been a standard tool
within computer programming.
@@ -32,7 +31,7 @@ It not only lets you declare
that an output file depends upon one (or more) inputs,
but lets you do this recursively.
A program, for example, might depend upon an object file
-which itself depends upon the corresponding source code::
+which itself depends upon the corresponding source code:

```
prog: main.o
@@ -70,8 +69,7 @@ The problem, again, is cross-referencing.
Where do cross-references tend to emerge?
In text documents, documentation, and printed books!

-The Problem: Building Document Systems
-======================================
+## The Problem: Building Document Systems

Systems to rebuild formatted documents from source texts
always seem to do too much work, or too little.
@@ -132,7 +130,7 @@ If you later reconsider the tutorial’s chapter title —
after all, the word “newcomer” sounds so antique,
as if your users are settlers who have just arrived in pioneer Wyoming —
then you would edit the first line of `tutorial.rst`
-and write something better::
+and write something better:

```
-Newcomers Tutorial
@@ -279,8 +277,7 @@ This can happen for many kinds of cross reference that Sphinx supports:
chapter titles, section titles, paragraphs,
classes, methods, and functions.

-Build Systems and Consistency
-=============================
+## Build Systems and Consistency

The problem outlined above is not specific to Sphinx.
Not only does it haunt other document systems, like LaTeX,
@@ -290,7 +287,7 @@ with the venerable `make` utility,
if their assets happen to cross-reference in interesting ways.

As the problem is ancient and universal,
-its solution is of equally long lineage::
+its solution is of equally long lineage:

```bash
$ rm -r _build/
@@ -332,8 +329,7 @@ while performing the fewest possible rebuild steps.
While Contingent can be applied to any problem domain,
we will run it against a small version of the problem outlined above.

-Linking Tasks To Make a Graph
-=============================
+## Linking Tasks To Make a Graph

Any build system needs a way to link inputs and outputs.
The three markup texts in our discussion above,
@@ -544,8 +540,7 @@ at either end of the edge.
But in return for this redundancy,
the data structure supports the fast lookup that Contingent needs.

-The Proper Use of Classes
-=========================
+## The Proper Use of Classes

You may have been surprised
by the absence of classes in the above discussion
@@ -637,7 +632,7 @@ and that the nodes themselves in these early examples
are simply strings.
Coming from other languages and traditions,
one might have expected to see
-user-defined classes and interfaces for everything in the system::
+user-defined classes and interfaces for everything in the system:

```java
Graph g = new ConcreteGraph();
@@ -862,8 +857,7 @@ will eventually have Contingent do for us:
the graph `g` captures the inputs and consequences
for the various artifacts in our project's documentation.

-Learning Connections
-====================
+## Learning Connections

We now have a way for Contingent
to keep track of tasks and the relationships between them.
@@ -1311,8 +1305,7 @@ at its disposal,
Contingent knows all the things to rebuild
if the inputs to any tasks change.

-Chasing Consequences
-====================
+## Chasing Consequences

Once the initial build has run to completion,
Contingent needs to monitor the input files for changes.
Expand Down Expand Up @@ -1542,8 +1535,7 @@ nevertheless returned the same value means that all further
downstream tasks were insulated from the change
and did not get re-invoked.

-Conclusion
-==========
+## Conclusion

There exist languages and programming methodologies
under which Contingent would be a suffocating forest of tiny classes
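The contingent.markdown hunks above describe building a graph of tasks and then chasing the consequences of a changed input. As a rough sketch of that core idea (the `Graph` class and method names here are illustrative, not Contingent's actual API):

```python
from collections import defaultdict

class Graph:
    """Toy dependency graph: edges point from an input to a consequence."""
    def __init__(self):
        self._consequences = defaultdict(set)

    def add_edge(self, input_task, consequence_task):
        self._consequences[input_task].add(consequence_task)

    def recursive_consequences_of(self, task):
        """Everything that must be rebuilt when `task` changes."""
        seen = set()
        stack = [task]
        while stack:
            for nxt in self._consequences[stack.pop()]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

g = Graph()
g.add_edge('tutorial.rst', 'tutorial.html')
g.add_edge('api.rst', 'api.html')
g.add_edge('tutorial.rst', 'index.html')  # index cross-references the title
print(sorted(g.recursive_consequences_of('tutorial.rst')))
```

Editing `tutorial.rst` forces both `tutorial.html` and `index.html` to rebuild, while `api.html` is untouched — the selective-rebuild behavior the chapter is after.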
14 changes: 7 additions & 7 deletions functionalDB/functionalDB.markdown
@@ -721,7 +721,7 @@ Our data model is based on accumulation of facts (i.e., datoms) over time. For t

### Query Language

-Let's look at an example query in our proposed language. This query asks: "What are the names and birthday of entities who like pizza, speak English, and who have a birthday this month?"
+Let's look at an example query in our proposed language. This query asks: "What are the names and birthdays of entities who like pizza, speak English, and who have a birthday this month?"
```clojure
{ :find [?nm ?bd ]
:where [
@@ -1281,12 +1281,12 @@ The twist to the index structure is that now we hold a binding pair of the entit

At the end of phase 3 of our example execution, we have the following structure at hand:
```clojure
-{[1 "?e"] {
-[:likes nil] ["Pizza" nil]
-[:name nil] ["USA" "?nm"]
-[:speaks nil] ["English" nil]
-[:birthday nil] ["July 4, 1776" "?bd"]}
-}}
+{[1 "?e"]{
+{[:likes nil] ["Pizza" nil]}
+{[:name nil] ["USA" "?nm"]}
+{[:speaks nil] ["English" nil]}
+{[:birthday nil] ["July 4, 1776" "?bd"]}
+}}
```

#### Phase 4: Unify and Report
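The nested structure in the functionalDB hunk binds query variables to values found in the index. Transcribed loosely into Python (an illustration only, not the book's Clojure implementation), the final reporting step that collects only variable-bound values might look like:

```python
# Phase-3 result, transcribed as a Python dict: the key pairs an entity id
# with its variable; each attribute maps to a (value, variable) pair,
# where variable is None when the query did not ask for that value.
phase3 = {
    (1, "?e"): {
        ("likes", None): ("Pizza", None),
        ("name", None): ("USA", "?nm"),
        ("speaks", None): ("English", None),
        ("birthday", None): ("July 4, 1776", "?bd"),
    }
}

def report_bindings(result):
    """Collect only the attribute values bound to query variables."""
    bindings = {}
    for attrs in result.values():
        for (value, variable) in attrs.values():
            if variable is not None:
                bindings[variable] = value
    return bindings

print(report_bindings(phase3))
```

Only `?nm` and `?bd` appear in the output, matching the `:find [?nm ?bd]` clause of the example query.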
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
6 changes: 3 additions & 3 deletions ocr/ocr.py → ocr/code/ocr.py
@@ -75,13 +75,13 @@ def train(self, training_data_array):
actual_vals = [0] * 10 # actual_vals is a python list for easy initialization and is later turned into an np matrix (2 lines down).
actual_vals[data['label']] = 1
output_errors = np.mat(actual_vals).T - np.mat(y2)
-hiddenErrors = np.multiply(np.dot(np.mat(self.theta2).T, output_errors), self.sigmoid_prime(sum1))
+hidden_errors = np.multiply(np.dot(np.mat(self.theta2).T, output_errors), self.sigmoid_prime(sum1))

# Step 4: Update weights
-self.theta1 += self.LEARNING_RATE * np.dot(np.mat(hiddenErrors), np.mat(data['y0']))
+self.theta1 += self.LEARNING_RATE * np.dot(np.mat(hidden_errors), np.mat(data['y0']))
self.theta2 += self.LEARNING_RATE * np.dot(np.mat(output_errors), np.mat(y1).T)
self.hidden_layer_bias += self.LEARNING_RATE * output_errors
-self.input_layer_bias += self.LEARNING_RATE * hiddenErrors
+self.input_layer_bias += self.LEARNING_RATE * hidden_errors

def predict(self, test):
y1 = np.dot(np.mat(self.theta1), np.mat(test).T)
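The renamed variable in ocr.py is the heart of the backpropagation step: the hidden-layer error is the output error pushed back through `theta2` and scaled by the sigmoid derivative. A self-contained numpy sketch of that update with made-up dimensions and learning rate (illustrative shapes, not the chapter's actual network; biases omitted for brevity):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # Derivative of the sigmoid, evaluated at the pre-activation z.
    return np.multiply(sigmoid(z), 1.0 - sigmoid(z))

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 3, 2          # hypothetical layer sizes
theta1 = rng.standard_normal((n_hidden, n_in))
theta2 = rng.standard_normal((n_out, n_hidden))
LEARNING_RATE = 0.1

y0 = rng.standard_normal((1, n_in))      # one input sample (row vector)
sum1 = theta1 @ y0.T                     # hidden pre-activation
y1 = sigmoid(sum1)                       # hidden activation
y2 = sigmoid(theta2 @ y1)                # output activation
target = np.array([[1.0], [0.0]])        # one-hot target, as in train()

# Backpropagate: output error, then hidden error through theta2.
output_errors = target - y2
hidden_errors = np.multiply(theta2.T @ output_errors, sigmoid_prime(sum1))

# Gradient-ascent-style weight updates, mirroring the diff above.
theta1 += LEARNING_RATE * (hidden_errors @ y0)
theta2 += LEARNING_RATE * (output_errors @ y1.T)
```

The matrix shapes line up exactly as in the chapter's code: `hidden_errors @ y0` matches `theta1`, and `output_errors @ y1.T` matches `theta2`.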
4 changes: 2 additions & 2 deletions ocr/server.py → ocr/code/server.py
@@ -24,8 +24,8 @@ class JSONHandler(BaseHTTPServer.BaseHTTPRequestHandler):
def do_POST(s):
response_code = 200
response = ""
-varLen = int(s.headers.get('Content-Length'))
-content = s.rfile.read(varLen);
+var_len = int(s.headers.get('Content-Length'))
+content = s.rfile.read(var_len);
payload = json.loads(content);

if payload.get('train'):
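The renamed `var_len` in server.py is the declared body length the handler uses to read the POST payload before decoding it as JSON. The same pattern in isolation, with a byte stream standing in for the handler's `rfile` (a sketch, not the chapter's Python 2 `BaseHTTPServer` handler):

```python
import io
import json

def read_json_payload(headers, rfile):
    """Read exactly Content-Length bytes from the stream, then decode JSON."""
    var_len = int(headers.get('Content-Length', 0))
    content = rfile.read(var_len)
    return json.loads(content)

# Simulated request: a headers dict plus a BytesIO standing in for rfile.
body = json.dumps({'train': True}).encode('utf-8')
headers = {'Content-Length': str(len(body))}
payload = read_json_payload(headers, io.BytesIO(body))
print(payload)  # {'train': True}
```

Reading exactly `Content-Length` bytes matters because the socket stream has no end-of-file until the client disconnects; an unbounded `read()` would block.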
Binary file added ocr/ocr-images/ann.png