How to Upload File From Dropbox Into Jupyter Notebook

Data Science in the Real World.

The story of how I ended up creating the dashboards of my dreams from a notebook, after trying and failing many other ways. Here I share how I connected my dataframes to Tableau, and I promise this blog helps anyone struggling between notebooks and dashboards.

Photo by Luke Chesser on Unsplash

The big picture of data analysis is not merely collecting data and deriving insights but also making that data understandable by everyone. Back when I was a beginner in data science, I was introduced to Jupyter notebooks. Notebooks are just like any other presentation medium. They consist of cells where you can put your code, run it, and see the output just below the cell; they also come with a markdown feature where you can document your code with basic HTML. These notebooks are used for exploratory data analysis, where it is easy to share your work with teammates. But they are still not a ready-to-go medium for non-technical users, even though they support libraries like Plotly, matplotlib, etc.

For example, take the case of COVID-19. Everyone across the world is concerned about knowing the daily status of their country. Will they choose a notebook packed with cells, code, and documentation, or a dashboard that presents your analyzed visualizations in an interactive form? This is why every data analyst needs to have storytelling skills in their pocket. Dashboarding is just like storytelling: you are framing a story from the data you have drawn, presented in a fashion that doesn't let your audience take their eyes off it.

So the question is: how do I connect my Jupyter notebook to Tableau without going through the installation of Python in Tableau?

The answer is simple. All that Tableau needs is data. A dataframe from JupyterLab can be saved in a format like CSV, XLS, or JSON, but in real-time analysis, when the data keeps updating on a day-to-day basis, it is inefficient to download the data daily from the Jupyter notebook and upload it into Tableau.

But there is a solution, even though notebooks don't provide database support, as their sole purpose is analyzing datasets.

How about sending these analyzed, ready-to-visualize datasets to Google Sheets using the Google API? Then, even if the data keeps updating on each refresh of our notebook, the data we read from an online portal will be saved to Google Sheets automatically after analysis, and Tableau can be connected to these sheets.

So the task of creating a real-time dashboard becomes simple. All you have to do is:

  • Create a notebook
  • Read the data from an online portal's data API into a pandas dataframe
  • Analyze it and create a ready-to-go, clean, and precise dataframe.
  • Send the data to Google Sheets.
  • Let's import the following libraries into our notebook:
          import gspread
          from oauth2client.service_account import ServiceAccountCredentials
          from df2gspread import df2gspread as d2g
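Note that gspread, oauth2client, and df2gspread do not ship with Python. Assuming a pip-based environment (an assumption, not stated in the original), they can be installed from PyPI:

```shell
pip install gspread oauth2client df2gspread pandas
```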
  • Now, before anything else, get credential access to your sheets by following these instructions:
  1. Head to the Google Developers Console and create a new project (or select the one you already have).

2. Under "APIs & Services > Library", search for "Drive API" and enable it.

3. Under "APIs & Services > Library", search for "Sheets API" and enable it.

4. Enable API Access for a Project if you haven't done it yet.

5. Go to "APIs & Services > Credentials" and choose "Create credentials > Service account key".

6. Fill out the form.

7. Click "Create key".

8. Select "JSON" and click "Create".

A JSON file with credentials will be downloaded automatically. It may look like this:

          {
            "type": "service_account",
            "project_id": "api-project-XXX",
            "private_key_id": "5ty … bg4",
            "private_key": "-----BEGIN PRIVATE KEY-----\nNrDyLw … jINQh/9\n-----END PRIVATE KEY-----\n",
            "client_email": "443000000000-msmhj@developer.gserviceaccount.com",
            "client_id": "445 … hd.apps.googleusercontent.com",
            ...
          }

Remember the path to the downloaded credentials file. Also, in the next step you'll need the value of client_email from this file. Go to your spreadsheet and share it with the client_email from the step above.
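If you are unsure which address to share the sheet with, the client_email can be read straight from the JSON file. A minimal sketch (the credentials file below is simulated with placeholder values purely for illustration; in your notebook, open the real file you downloaded):

```python
import json
import os
import tempfile

# For illustration only: simulate the downloaded credentials file with
# placeholder values matching the structure shown above.
sample = {
    "type": "service_account",
    "client_email": "443000000000-msmhj@developer.gserviceaccount.com",
}
path = os.path.join(tempfile.gettempdir(), "myfile.json")
with open(path, "w") as f:
    json.dump(sample, f)

# In practice, point this at the real credentials file you downloaded:
with open(path) as f:
    client_email = json.load(f)["client_email"]
print(client_email)  # share your spreadsheet with this address
```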

  • Create a new Google Sheet and copy its key to a variable named spreadsheet_key. For example: (the text in bold will be the key for my sheet, found in the link with which you open your sheet)
          https://docs.google.com/spreadsheets/d/1vd6hDT6k8lcWNMDYz13kqW7zXnIX5TVED9–4Kq7EAQnvI/edit#gid=0        
  • Add the following code to your Python script or notebook. Also, a friendly reminder that the JSON file you just downloaded and the Python script should be in the same folder before copying the following code.
          scope = ['https://spreadsheets.google.com/feeds',
                   'https://www.googleapis.com/auth/drive']
          credentials = ServiceAccountCredentials.from_json_keyfile_name(
              'myfile.json', scopes=scope)
          gc = gspread.authorize(credentials)
          spreadsheet_key = '1vd6hDT6k8lcWNMDYz13kqW7zXnIX5TVED9–4Kq7EAQnvI'
  • Now clean your datasets and save the ready-to-visualize data to a dataframe.
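As an illustration of this cleaning step, here is a minimal sketch. The column names (Date, State, Confirmed) and values are hypothetical, made up in the spirit of the COVID-19 example; substitute the fields your own data API returns:

```python
import pandas as pd

# Hypothetical raw data as it might arrive from a portal's API:
# numbers stored as strings, with some incomplete rows.
raw = pd.DataFrame({
    "Date": ["2020-05-01", "2020-05-01", None],
    "State": ["Kerala", "Delhi", "Goa"],
    "Confirmed": ["499", "3738", None],
})

df = (
    raw.dropna()                                       # drop incomplete rows
       .assign(Date=lambda d: pd.to_datetime(d["Date"]),
               Confirmed=lambda d: d["Confirmed"].astype(int))
       .sort_values("Confirmed", ascending=False)      # largest counts first
       .reset_index(drop=True)
)
print(df)
```

The resulting df is the clean, typed dataframe that gets pushed to Google Sheets in the next step.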
  • Copy and paste the following code to send your dataframe to Google Sheets.
          wks_name = "nameofyoursheet"
          d2g.upload(df, spreadsheet_key, wks_name, credentials=credentials, row_names=True)
          print("Data upload success")

wks_name will be your sheet's name and df will be your dataframe variable. So now, each time your notebook is refreshed, the data gets updated automatically in the sheets from the data API you are reading into your notebook, making real-time analysis with a Jupyter notebook easy.

  • After doing the above steps, open Tableau and connect your Google Sheets. To download Tableau click here
  • To connect, go to Server > Google Sheets

After adding a Google Sheet to the Tableau data connection (data source: OGD Platform India)
  • Create your visualizations in Tableau using the dataframe from Sheets. Each visualization is done in a sheet, which is what they are called in Tableau. These sheets can be dragged and dropped into a dashboard. Publish your dashboard to Tableau Public and you are done.

Now, as the data changes every day, all we have to do is refresh the notebook and the Tableau data connection. All your visualizations get updated automatically.

One advantage Tableau has over visualization libraries like Plotly and matplotlib is that you get visualizations with zero coding, and they look incredibly professional.

Click here to view one of my dashboards, which I built using a Jupyter notebook (for analysis) and Tableau (for visualizations).


Source: https://towardsdatascience.com/from-analysis-to-dashboarding-connecting-notebooks-to-tableau-483fa373f3a4
