
Commit 3d00c9b

manaskarekar authored and aymns committed
Enable syntax highlighting in all python code snippets (donnemartin#268)
1 parent bc230d1 · commit 3d00c9b

File tree

10 files changed: +30 −30 lines changed


README-ja.md

Lines changed: 2 additions & 2 deletions

@@ -1166,7 +1166,7 @@ Redis also provides the following features:
 * Add the entry to the cache
 * Return the entry
 
-```
+```python
 def get_user(self, user_id):
     user = cache.get("user.{0}", user_id)
     if user is None:
@@ -1209,7 +1209,7 @@ set_user(12345, {"foo":"bar"})
 
 Cache code:
 
-```
+```python
 def set_user(user_id, values):
     user = db.query("UPDATE Users WHERE id = {0}", user_id, values)
     cache.set(user_id, user)

README-zh-Hans.md

Lines changed: 2 additions & 2 deletions

@@ -1180,7 +1180,7 @@ Redis has the following additional features:
 - Store the retrieved result in the cache
 - Return the requested content
 
-```
+```python
 def get_user(self, user_id):
     user = cache.get("user.{0}", user_id)
     if user is None:
@@ -1223,7 +1223,7 @@ set_user(12345, {"foo":"bar"})
 
 Cache code:
 
-```
+```python
 def set_user(user_id, values):
     user = db.query("UPDATE Users WHERE id = {0}", user_id, values)
     cache.set(user_id, user)

README-zh-TW.md

Lines changed: 2 additions & 2 deletions

@@ -1167,7 +1167,7 @@ Redis also offers the following additional features:
 * Store the record in the cache
 * Return the data
 
-```
+```python
 def get_user(self, user_id):
     user = cache.get("user.{0}", user_id)
     if user is None:
@@ -1210,7 +1210,7 @@ set_user(12345, {"foo":"bar"})
 
 Cache code:
 
-```
+```python
 def set_user(user_id, values):
     user = db.query("UPDATE Users WHERE id = {0}", user_id, values)
     cache.set(user_id, user)

README.md

Lines changed: 3 additions & 3 deletions

@@ -1164,7 +1164,7 @@ The application is responsible for reading and writing from storage. The cache
 * Add entry to cache
 * Return entry
 
-```
+```python
 def get_user(self, user_id):
     user = cache.get("user.{0}", user_id)
     if user is None:
@@ -1201,13 +1201,13 @@ The application uses the cache as the main data store, reading and writing data
 
 Application code:
 
-```
+```python
 set_user(12345, {"foo":"bar"})
 ```
 
 Cache code:
 
-```
+```python
 def set_user(user_id, values):
     user = db.query("UPDATE Users WHERE id = {0}", user_id, values)
     cache.set(user_id, user)
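The `get_user` and `set_user` snippets above are cut off by the diff context. As a runnable sketch of the cache-aside read path, with plain dicts standing in for the hypothetical `cache` and `db` clients:

```python
# Cache-aside sketch: plain dicts stand in for real cache/database clients.
cache = {}
db = {12345: {"name": "foo"}}

def get_user(user_id):
    """Return the user, consulting the cache before storage."""
    key = "user.{0}".format(user_id)
    user = cache.get(key)
    if user is None:
        user = db.get(user_id)   # cache miss: read from storage
        if user is not None:
            cache[key] = user    # add entry to cache
    return user                  # return entry
```

A second call for the same id is then served from the cache without touching storage.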

solutions/system_design/mint/README.md

Lines changed: 5 additions & 5 deletions

@@ -182,7 +182,7 @@ For the **Category Service**, we can seed a seller-to-category dictionary with t
 
 **Clarify with your interviewer how much code you are expected to write**.
 
-```
+```python
 class DefaultCategories(Enum):
 
     HOUSING = 0
@@ -199,7 +199,7 @@ seller_category_map['Target'] = DefaultCategories.SHOPPING
 
 For sellers not initially seeded in the map, we could use a crowdsourcing effort by evaluating the manual category overrides our users provide. We could use a heap to quickly look up the top manual override per seller in O(1) time.
 
-```
+```python
 class Categorizer(object):
 
     def __init__(self, seller_category_map, self.seller_category_crowd_overrides_map):
@@ -219,7 +219,7 @@ class Categorizer(object):
 
 Transaction implementation:
 
-```
+```python
 class Transaction(object):
 
     def __init__(self, created_at, seller, amount):
@@ -232,7 +232,7 @@ class Transaction(object):
 
 To start, we could use a generic budget template that allocates category amounts based on income tiers. Using this approach, we would not have to store the 100 million budget items identified in the constraints, only those that the user overrides. If a user overrides a budget category, we could store the override in the `TABLE budget_overrides`.
 
-```
+```python
 class Budget(object):
 
     def __init__(self, income):
@@ -273,7 +273,7 @@ user_id timestamp seller amount
 
 **MapReduce** implementation:
 
-```
+```python
 class SpendingByCategory(MRJob):
 
     def __init__(self, categorizer):
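The `DefaultCategories` and seller map fragments in the first two hunks can be combined into a small runnable sketch. Only `HOUSING = 0` and the `'Target'` entry appear in the diff context, so the remaining enum members and the `categorize` helper below are illustrative assumptions:

```python
from enum import Enum

class DefaultCategories(Enum):
    HOUSING = 0      # the only member visible in the diff context
    GROCERIES = 1    # illustrative placeholder
    SHOPPING = 2     # illustrative placeholder

# Seed a seller-to-category dictionary with the most popular sellers
seller_category_map = {}
seller_category_map['Target'] = DefaultCategories.SHOPPING

def categorize(seller):
    """Hypothetical helper: look up a seller's seeded default category."""
    return seller_category_map.get(seller)
```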

solutions/system_design/pastebin/README.md

Lines changed: 2 additions & 2 deletions

@@ -130,7 +130,7 @@ To generate the unique url, we could:
 * Base 64 is another popular encoding but provides issues for urls because of the additional `+` and `/` characters
 * The following [Base 62 pseudocode](http://stackoverflow.com/questions/742013/how-to-code-a-url-shortener) runs in O(k) time where k is the number of digits = 7:
 
-```
+```python
 def base_encode(num, base=62):
     digits = []
     while num > 0
@@ -142,7 +142,7 @@ def base_encode(num, base=62):
 
 * Take the first 7 characters of the output, which results in 62^7 possible values and should be sufficient to handle our constraint of 360 million shortlinks in 3 years:
 
-```
+```python
 url = base_encode(md5(ip_address+timestamp))[:URL_LENGTH]
 ```
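The Base 62 pseudocode above is truncated by the diff (and its `while num > 0` line lacks a colon). A completed, runnable version might look like this; the digit alphabet ordering is an assumption, since the diff does not show it:

```python
import string

# One common Base 62 alphabet: 0-9, then a-z, then A-Z (an assumption;
# the diff context does not show which ordering the document uses).
ALPHABET = string.digits + string.ascii_lowercase + string.ascii_uppercase

def base_encode(num, base=62):
    """Encode a non-negative integer in the given base: O(k) in digit count."""
    if num == 0:
        return ALPHABET[0]
    digits = []
    while num > 0:
        num, remainder = divmod(num, base)
        digits.append(ALPHABET[remainder])
    digits.reverse()  # remainders come out least-significant first
    return ''.join(digits)
```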

solutions/system_design/query_cache/README.md

Lines changed: 4 additions & 4 deletions

@@ -97,7 +97,7 @@ The cache can use a doubly-linked list: new items will be added to the head whil
 
 **Query API Server** implementation:
 
-```
+```python
 class QueryApi(object):
 
     def __init__(self, memory_cache, reverse_index_service):
@@ -121,7 +121,7 @@ class QueryApi(object):
 
 **Node** implementation:
 
-```
+```python
 class Node(object):
 
     def __init__(self, query, results):
@@ -131,7 +131,7 @@ class Node(object):
 
 **LinkedList** implementation:
 
-```
+```python
 class LinkedList(object):
 
     def __init__(self):
@@ -150,7 +150,7 @@ class LinkedList(object):
 
 **Cache** implementation:
 
-```
+```python
 class Cache(object):
 
     def __init__(self, MAX_SIZE):
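The `Node`, `LinkedList`, and `Cache` bodies are truncated by the diff context. The same LRU behavior can be sketched compactly with `collections.OrderedDict` in place of the hand-rolled doubly linked list (a substitution for brevity, not the document's implementation):

```python
from collections import OrderedDict

class Cache(object):
    """LRU cache sketch: the least recently used entry is evicted first."""

    def __init__(self, MAX_SIZE):
        self.MAX_SIZE = MAX_SIZE
        self.lookup = OrderedDict()  # query -> results, oldest entry first

    def get(self, query):
        """Return cached results and mark the entry most recently used."""
        if query not in self.lookup:
            return None
        self.lookup.move_to_end(query)  # most recently used goes to the end
        return self.lookup[query]

    def set(self, query, results):
        """Insert results, evicting the least recently used entry if full."""
        self.lookup[query] = results
        self.lookup.move_to_end(query)
        if len(self.lookup) > self.MAX_SIZE:
            self.lookup.popitem(last=False)  # evict the oldest entry
```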

solutions/system_design/sales_rank/README.md

Lines changed: 1 addition & 1 deletion

@@ -102,7 +102,7 @@ We'll use a multi-step **MapReduce**:
 * **Step 1** - Transform the data to `(category, product_id), sum(quantity)`
 * **Step 2** - Perform a distributed sort
 
-```
+```python
 class SalesRanker(MRJob):
 
     def within_past_week(self, timestamp):
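The `SalesRanker` MRJob body is truncated here. The two steps listed above can be approximated locally without the mrjob dependency; the `sales_rank` helper is a hypothetical stand-in for the distributed job:

```python
from collections import defaultdict

def sales_rank(transactions):
    """Step 1: sum quantity per (category, product_id).
    Step 2: sort each category's products by total quantity, descending."""
    totals = defaultdict(int)
    for category, product_id, quantity in transactions:
        totals[(category, product_id)] += quantity
    ranked = defaultdict(list)
    for (category, product_id), total in totals.items():
        ranked[category].append((total, product_id))
    for category in ranked:
        ranked[category].sort(reverse=True)  # highest total quantity first
    return dict(ranked)
```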

solutions/system_design/social_graph/README.md

Lines changed: 5 additions & 5 deletions

@@ -62,7 +62,7 @@ Handy conversion guide:
 
 Without the constraint of millions of users (vertices) and billions of friend relationships (edges), we could solve this unweighted shortest path task with a general BFS approach:
 
-```
+```python
 class Graph(Graph):
 
     def shortest_path(self, source, dest):
@@ -117,7 +117,7 @@ We won't be able to fit all users on the same machine, we'll need to [shard](htt
 
 **Lookup Service** implementation:
 
-```
+```python
 class LookupService(object):
 
     def __init__(self):
@@ -132,7 +132,7 @@ class LookupService(object):
 
 **Person Server** implementation:
 
-```
+```python
 class PersonServer(object):
 
     def __init__(self):
@@ -151,7 +151,7 @@ class PersonServer(object):
 
 **Person** implementation:
 
-```
+```python
 class Person(object):
 
     def __init__(self, id, name, friend_ids):
@@ -162,7 +162,7 @@ class Person(object):
 
 **User Graph Service** implementation:
 
-```
+```python
 class UserGraphService(object):
 
     def __init__(self, lookup_service):
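The `shortest_path` body is truncated by the diff context. A self-contained BFS sketch over an adjacency-list graph follows; the dict-based `Graph` below is a simplified stand-in for the document's class:

```python
from collections import deque

class Graph(object):
    """Adjacency-list graph; a simplified stand-in for the diff's Graph."""

    def __init__(self):
        self.adjacency = {}  # node -> list of neighbor nodes

    def add_edge(self, source, dest):
        self.adjacency.setdefault(source, []).append(dest)
        self.adjacency.setdefault(dest, []).append(source)

    def shortest_path(self, source, dest):
        """Return the nodes on an unweighted shortest path, or None."""
        if source == dest:
            return [source]
        prev_node = {source: None}  # also serves as the visited set
        queue = deque([source])
        while queue:
            node = queue.popleft()
            for neighbor in self.adjacency.get(node, []):
                if neighbor in prev_node:
                    continue
                prev_node[neighbor] = node
                if neighbor == dest:
                    # Walk the predecessor chain back to the source
                    path = [dest]
                    while prev_node[path[-1]] is not None:
                        path.append(prev_node[path[-1]])
                    return list(reversed(path))
                queue.append(neighbor)
        return None  # dest unreachable from source
```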

solutions/system_design/web_crawler/README.md

Lines changed: 4 additions & 4 deletions

@@ -100,7 +100,7 @@ We could store `links_to_crawl` and `crawled_links` in a key-value **NoSQL Datab
 
 `PagesDataStore` is an abstraction within the **Crawler Service** that uses the **NoSQL Database**:
 
-```
+```python
 class PagesDataStore(object):
 
     def __init__(self, db);
@@ -134,7 +134,7 @@ class PagesDataStore(object):
 
 `Page` is an abstraction within the **Crawler Service** that encapsulates a page, its contents, child urls, and signature:
 
-```
+```python
 class Page(object):
 
     def __init__(self, url, contents, child_urls, signature):
@@ -146,7 +146,7 @@ class Page(object):
 
 `Crawler` is the main class within **Crawler Service**, composed of `Page` and `PagesDataStore`.
 
-```
+```python
 class Crawler(object):
 
     def __init__(self, data_store, reverse_index_queue, doc_index_queue):
@@ -187,7 +187,7 @@ We'll want to remove duplicate urls:
 * For smaller lists we could use something like `sort | unique`
 * With 1 billion links to crawl, we could use **MapReduce** to output only entries that have a frequency of 1
 
-```
+```python
 class RemoveDuplicateUrls(MRJob):
 
     def mapper(self, _, line):
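The `RemoveDuplicateUrls` job is truncated above. Its frequency-of-1 filter can be sketched locally in plain Python (a stand-in for the distributed MapReduce job, not the document's mrjob code):

```python
from collections import Counter

def remove_duplicate_urls(urls):
    """Keep only urls that appear exactly once, mirroring the MapReduce
    mapper (url -> 1) and reducer (emit when the summed count == 1)."""
    counts = Counter(urls)  # mapper + shuffle: total count per url
    return [url for url in urls if counts[url] == 1]
```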

0 commit comments
