Hi! We are BNIA-JFI.

This package was made to help with data handling.

Included

  • Functions built and used by BNIA for day-to-day tasks.
  • Made to be shared via IPYNB/Google Colab notebooks, with built-in examples using 100% publicly accessible data and links.
  • Online documentation and PyPI libraries created from the notebooks.

VitalSigns uses functions found in our Dataplay module, and vice versa.


ACS Scripts

These scripts will download and clean ACS data for Baltimore and then construct indicators from the data.

AcsDownload.ipynb

  • Mount the notebook to Google Drive.
  • Read in the ACS metadata from the XLSX and the crosswalk data from a CSV.
  • Add the path to the Python script that performs the download function.
  • Enter a year, then run the download function for every record in the XLSX sheet.
  • For each dataset, remove the column IDs, then save it as Raw.
  • Then append community names using the crosswalk and save again as Clean.
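
A minimal sketch of that loop is below. The metadata workbook, crosswalk file, and join columns are assumptions for illustration; only retrieve_acs_data and its argument order come from the package:

import pandas as pd
from VitalSigns.acsDownload import retrieve_acs_data

meta = pd.read_excel('acs_meta.xlsx')      # hypothetical ACS metadata workbook
crosswalk = pd.read_csv('crosswalk.csv')   # hypothetical tract-to-community crosswalk
year = '17'

for _, row in meta.iterrows():
    # Baltimore City: state 24, county 510, all tracts. (Column-ID cleanup omitted here.)
    df = retrieve_acs_data('24', '510', '*', row['tableId'], year, False)
    df.to_csv(f"{row['tableId']}_{year}_Raw.csv")                   # save as Raw
    clean = df.merge(crosswalk, left_on='tract', right_on='TRACT')  # assumed join columns
    clean.to_csv(f"{row['tableId']}_{year}_Clean.csv")              # save as Clean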

AcsIndicators.ipynb

  • Mount the notebook to Google Drive.
  • Read in the ACS metadata from the XLSX.
  • Prepare the comparison against historic data.
  • For each indicator:
    • Grab its metadata.
    • If the indicator is valid for the year, uses ACS data, and its method exists, retrieve the Python ACS indicator.
    • Put Baltimore City at the bottom of the list.
    • Write the results back into the XLSX dataframe.
  • Save the dataset:
    • Drop columns with any empty values.
    • Save the data XLSX.
    • Do the comparison to the historic year, if one exists, and write the XLSX.
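
A loose sketch of that loop, assuming hypothetical metadata columns (years, source, method), that each method name maps to a function in VitalSigns.indicators, and that each function returns a scalar per year:

import pandas as pd
import VitalSigns.indicators as indicators

meta = pd.read_excel('indicator_meta.xlsx')    # hypothetical metadata workbook
year = '17'

for idx, row in meta.iterrows():
    # Skip indicators that are invalid for the year, non-ACS, or unimplemented.
    if year not in str(row['years']) or row['source'] != 'ACS':
        continue
    fn = getattr(indicators, row['method'], None)
    if fn is None:
        continue
    meta.at[idx, 'result'] = fn(year)          # assumed per-indicator call signature

meta = meta.dropna(axis='columns')             # drop columns with any empty values
meta.to_excel('indicators_' + year + '.xlsx')  # write the xlsx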

Usage Instructions

Install the Package

The code is on PyPI, so you can install the scripts as a Python library using the command:

!pip install VitalSigns geopandas

Important: Contributors should follow the maintenance instructions and will not need to run this step.

Their modules will be retrieved from the VitalSigns GDrive repo they have mounted into their Colab environment.
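
For reference, mounting your Drive in Colab looks like this (the target folder is an example; use wherever your copy lives):

from google.colab import drive
drive.mount('/content/drive')
%cd /content/drive/My Drive/VitalSigns   # example path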

Then...

Import Modules

  1. Import the installed module into your code:
from VitalSigns.acsDownload import retrieve_acs_data 
  2. Use it:
retrieve_acs_data(state, county, tract, tableId, year, saveAcs)

Now you could do something like merge it with another dataset!

from dataplay.merge import mergeDatasets
mergeDatasets(left_ds=False, right_ds=False, crosswalk_ds=False, use_crosswalk=True,
    left_col=False, right_col=False, crosswalk_left_col=False,
    crosswalk_right_col=False, merge_how=False, interactive=True)
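
If you'd rather skip the interactive prompts, you can fill in the parameters directly. The parameter names come from the signature above; the dataset variables and column names below are placeholders, not BNIA's real ones:

merged = mergeDatasets(
    left_ds=df, right_ds=other_df,                  # other_df: a hypothetical second dataset
    crosswalk_ds=crosswalk_df, use_crosswalk=True,  # crosswalk_df: hypothetical crosswalk table
    left_col='tract', right_col='tract',            # placeholder join columns
    crosswalk_left_col='TRACT', crosswalk_right_col='CSA2010',
    merge_how='outer',                              # assumed to accept pandas-style values
    interactive=False)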

Getting Help

You can get information on the package by using the help command.

Here we look at the package's modules:

import VitalSigns
help(VitalSigns)

Help on package VitalSigns:

NAME
    VitalSigns

PACKAGE CONTENTS
    BCPSS
    HUD
    _nbdev
    acsDownload
    bidbaltimore
    bpd
    citistat
    cityfinance
    closecrawl
    create
    dhr
    enoch
    fares
    fdic
    indicators
    infousa
    liquor
    mdprop
    rbintel
    tidyaddr
    treebaltimore
    utils

VERSION
    0.0.5

FILE
    /usr/local/lib/python3.7/dist-packages/VitalSigns/__init__.py

Let's take a look at what functions the acsDownload module provides:

import VitalSigns.acsDownload
help(VitalSigns.acsDownload)

Help on module VitalSigns.acsDownload in VitalSigns:

NAME
    VitalSigns.acsDownload - # AUTOGENERATED! DO NOT EDIT! File to edit: notebooks/90_ACS_Explore_and_Download.ipynb (unless otherwise specified).

FUNCTIONS
    retrieve_acs_data(state, county, tract, tableId, year, save)

DATA
    __all__ = ['retrieve_acs_data']
    __warningregistry__ = {'version': 268, ('Passing a negative integer is...

FILE
    /usr/local/lib/python3.7/dist-packages/VitalSigns/acsDownload.py

And here we can look at an individual function and what it expects:

import VitalSigns.acsDownload
help(VitalSigns.acsDownload.retrieve_acs_data)

Help on function retrieve_acs_data in module VitalSigns.acsDownload:

retrieve_acs_data(state, county, tract, tableId, year, save)

Examples

So here's an example:

Import your modules

from VitalSigns.acsDownload import retrieve_acs_data

import pandas as pd
pd.set_option('display.max_rows', 10)
pd.set_option('display.max_columns', 6)
pd.set_option('display.width', 10)
pd.set_option('max_colwidth', 20)

Read in some data

Define our download parameters.

More information on these parameters can be found in the tutorials!

# Our download function will use Baltimore City's tract, county, and state as internal parameters.
# Changing the values below to different geographic reference codes will change those parameters.
tract = '*'
county = '510'
state = '24'

# Specify the download parameters the function will receive here.
tableId = 'B19001'
year = '17'
saveAcs = False

And download the Baltimore City ACS data using the imported VitalSigns library.

df = retrieve_acs_data(state, county, tract, tableId, year, saveAcs)
df.head(1)

Number of Columns 17

NAME                  B19001_001E_Total  B19001_002E_Total_Less_than_$10,000  B19001_003E_Total_$10,000_to_$14,999  ...  state  county   tract
Census Tract 2710.02               1510                                  209                                    73  ...     24     510  271002

1 rows × 20 columns

From there, you can go on to do even greater things using our dataplay library, like mapping the data.
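
For instance, here is a minimal geopandas sketch of mapping the table we just downloaded; the boundary file and join columns are assumptions:

import geopandas as gpd

csas = gpd.read_file('csa_boundaries.geojson')   # hypothetical CSA boundary file
joined = csas.merge(df.reset_index(), left_on='CSA2010', right_on='NAME')  # assumed join columns
joined.plot(column='B19001_001E_Total', legend=True)  # choropleth of total households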

Have Fun!

MISC

This section is not definitive, but it provides a good idea of how our scripts are made.

Basic Indicator Creation Outline

  • ? Count = 1
  • Create the num and denom
  • Filter the num and denom
  • ? sum/median = ungrouped.median
  • Group by CSA
  • ? Bcity = median or sum
  • Perform the calculation
  • Compare years
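
Here is a rough pandas sketch of that outline; every column name is an assumption for illustration:

import pandas as pd

# df: tract-level data that already has a 'CSA2010' column from the crosswalk (assumed).
df['num'] = df['B19001_002E_Total_Less_than_$10,000']  # example numerator
df['denom'] = df['B19001_001E_Total']                  # example denominator

# The Bcity value is calculated before aggregation and appended after.
bcity = pd.Series({'num': df['num'].sum(), 'denom': df['denom'].sum()}, name='Baltimore City')

grouped = df.groupby('CSA2010')[['num', 'denom']].sum()         # group by CSA
grouped = pd.concat([grouped, bcity.to_frame().T])              # Baltimore City at the bottom
grouped['indicator'] = 100 * grouped['num'] / grouped['denom']  # perform the calculation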

Miscellaneous things I should have for every notebook

  • Module/filenames need to be fixed.
  • RB Intel has the best preliminary-analysis script; the others are a bit messed up.
  • Include links to each indicator's Esri and BNIA pages, with details on category, name, description, and years.
  • Don't drop columns at the end; keep only the selected columns from the beginning.
  • Merge on CSA for ordering.
  • The Bcity median gets calculated before aggregation and appended after.
  • Add years in the header. Use the denom and numerator as variable names.
  • Add code to compare past years.

FOR CONTRIBUTORS

Dev Instructions

From a local copy of the git repo:

  0. Clone the repo locally onto GDrive
  • Via direct download and Drive upload, or via Colab/Terminal/Git
  • git clone https://github.com/BNIA/VitalSigns.git
  1. Update the IPYNBs
  • From the GDrive VitalSigns folder via Colab
  2. Build the new libraries from these notebooks
  • Using this index.ipynb:
    • Mount the Colab environment and navigate to the GDrive VitalSigns folder
    • Run !nbdev_build_lib to build the .py modules.
  3. Test the library/modules
  • Using the same runtime as step 2's index.ipynb:
    • Do not install the module from PyPI (if published); instead...
    • Import your module (from your VitalSigns/VitalSigns)
    • If everything runs properly, go to step 5.
  4. Edit modules directly
  • Within the same runtime as step 2/3's index.ipynb:
    • Locate VitalSigns/VitalSigns using the Colab file nav
    • Double-click the .py modules in the file nav to open them in an in-browser editor
  • Make changes and return to step 3, with the following caveat:
    • Use hot module reloading so updates are automatically re-imported:
    • %load_ext autoreload
      %autoreload 2
  • When finished, persist the changes from the .py modules back to the .ipynb docs
    • via !nbdev_update_lib and !relimport2name
  5. Create docs, push to GitHub, and publish to PyPI
  • All done via nbdev
  • Find more notes I made on that here: dataplay > nbdev notes
  • !nbdev_build_docs --force_all True --mk_readme True
  • !git commit -m ...
  • %%capture
    ! pip install twine
  • !nbdev_bump_version
  • ! make pypi
# https://nbdev.fast.ai/tutorial.html#Set-up-prerequisites
# settings.ini > requirements = fastcore>=1.0.5 torchvision<0.7
# https://nbdev.fast.ai/tutorial.html#View-docs-locally
# console_scripts = nbdev_build_lib=nbdev.cli:nbdev_build_lib
# https://nbdev.fast.ai/search

Dev Scripts

#hide
!pip install nbdev
from google.colab import drive
drive.mount('/content/drive')
%cd /content/drive/My Drive/'Software Development Documents'/
%cd VitalSigns
%ls

#hide
# This will reload imported modules whenever the .py file changes,
# e.g. via nbdev_build_lib or nbdev_update_lib.
%load_ext autoreload
%autoreload 2

#hide
# !nbdev_build_lib
# !nbdev_build_docs --force_all True --mk_readme True
# !nbdev_nb2md 'notebooks/index.ipynb' > README.md

#hide
# https://nbdev.fast.ai/tutorial.html#Add-in-notebook-export-cell
# https://nbdev.fast.ai/sync#nbdev_update_lib
# First: builds the .py files from the .ipynbs
# ____ !nbdev_build_lib # --fname filename.ipynb
# Second: push .py changes back to their original .ipynbs
# ____ !nbdev_update_lib
# Sometimes: update .ipynb import statements if the .py filename.classname changes.
# ____ !relimport2name
# nbdev_build_docs builds the documentation from the notebooks
# ____ !nbdev_build_docs --force_all True --mk_readme True

#hide
"""
! git add *
! git config --global user.name "bnia"
! git config --global user.email "charles.karpati@gmail.com"
! git commit -m "initial commit"
# git push -f origin master
! git push -u ORIGIN main
"""

#hide
! pip install twine
# ! nbdev_bump_version
! make pypi


DONATE

Help us keep this resource free and available to the public. Donate now!

Donate to BNIA-JFI

CONTACT US

Baltimore Neighborhood Indicators Alliance
The Jacob France Institute
1420 N. Charles Street, Baltimore, MD 21201
410-837-4377 | bnia-jfi@ubalt.edu