OperationalError: no such table: cu.area #73
How did you install the library? I just installed it from scratch in a fresh pipenv environment and got no errors.
If you ran an update, it looks like we are getting blocked from downloading by bot detection added to the BLS website.
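As a quick check from the user's side, here is a minimal sketch that tries to fetch one of the BLS flat files directly. The URL and the need for a browser-like User-Agent header are my assumptions, not something confirmed in this thread:

# Hypothetical check: see whether the BLS flat-file server answers at all.
# The URL below is an assumption about where the cu.area data comes from.
import requests

url = "https://download.bls.gov/pub/time.series/cu/cu.area"
headers = {"User-Agent": "Mozilla/5.0 (compatible; cpi-debug-check)"}

response = requests.get(url, headers=headers, timeout=30)
print(response.status_code)   # a 403 here would point to bot detection
print(response.text[:200])    # first characters of the file if the request worked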
I've attempted to patch this bug and to add a better test against its future recurrence. Please upgrade to version 1.0.18 and try again. Assuming it's fixed, I'll close this ticket. If you still have the problem, please speak up.
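For anyone landing here later, a minimal sketch of how one might verify that the upgrade took effect. The version check uses importlib.metadata, and cpi.inflate comes from the library's README; treat the exact numbers as illustrative only:

# Sketch: confirm which cpi version is actually installed, then run a smoke test.
# Run this after upgrading the package (e.g. pip install --upgrade cpi).
from importlib.metadata import version

print(version("cpi"))            # expect 1.0.18 or later

import cpi                       # this is where the OperationalError used to fire
print(cpi.inflate(100, 2000))    # quick smoke test that the data files were parsed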
When I try importing the cpi library after installation, I get the following error:
OperationalError Traceback (most recent call last)
/var/folders/cx/c0rnck312cxbw_4wbq7ylfw80000gn/T/ipykernel_38424/1241436912.py in <module>
----> 1 import cpi
~/opt/anaconda3/lib/python3.9/site-packages/cpi/__init__.py in <module>
19 # Parse data for use
20 logger.info("Parsing data files from the BLS")
---> 21 areas = parsers.ParseArea().parse()
22 items = parsers.ParseItem().parse()
23 periods = parsers.ParsePeriod().parse()
~/opt/anaconda3/lib/python3.9/site-packages/cpi/parsers.py in parse(self)
58 logger.debug("Parsing area file")
59 object_list = MappingList()
---> 60 for row in self.get_file("cu.area"):
61 obj = Area(row["area_code"], row["area_name"])
62 object_list.append(obj)
~/opt/anaconda3/lib/python3.9/site-packages/cpi/parsers.py in get_file(self, file)
35
36 # Query this file
---> 37 query = cursor.execute(f'SELECT * FROM "{file}"')
38 columns = [d[0] for d in query.description]
39 result_list = [dict(zip(columns, r)) for r in query.fetchall()]
OperationalError: no such table: cu.area
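For reference, here is a small diagnostic sketch that lists the tables the parser would see. I'm assuming the parsed data lives in a SQLite file inside the installed package; the cpi.db filename is a guess on my part, so adjust it to whatever .db file actually ships in the package directory:

# Hypothetical diagnostic: list the tables in the SQLite file the parser reads.
import importlib.util
import os
import sqlite3

# Locate the installed cpi package without importing it (the import itself fails here).
spec = importlib.util.find_spec("cpi")
package_dir = os.path.dirname(spec.origin)

# The database filename below is an assumption; look for a .db file in package_dir.
db_path = os.path.join(package_dir, "cpi.db")

connection = sqlite3.connect(db_path)
cursor = connection.cursor()
rows = cursor.execute("SELECT name FROM sqlite_master WHERE type='table'").fetchall()
print(rows)   # 'cu.area' should be listed if the data downloaded and parsed correctly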
Could someone please help me fix this error?