Access the local file system on Azure Databricks.

Hello

I need help accessing the local file system on Azure Databricks.

I tried to use this as an example:

https://github.com/datarobot-community/examples-for-data-scientists/blob/master/Making%20Prediction/

and got stuck when trying to access the local filesystem -- in this case, DBFS.

I get the error below:

InputNotUnderstoodError: sourcedata parameter not understood. Use pandas DataFrame, file object or string that is either a path to file or raw file content to specify data to upload

import datarobot as dr

job = dr.BatchPredictionJob.score_to_file(
    deploymentId,
    intake_path='dbfs:/pathToRequest.csv',
    output_path='dbfs:/pathToResponse.csv',
    passthrough_columns_set='all',
)

 

1 Reply
bijukdr
DataRobot Alumni

Hi Jessica,

The supported intake options are documented here.

Perhaps you can mount an Azure Blob Storage container as a DBFS mount, write your scoring data to the mount, and then score directly from the blob as shown here.
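
As a rough illustration of that first option, here is a minimal sketch of how it could look from a Databricks notebook. The storage account, container, secret scope/key, credential ID, and deployment ID are all placeholders (none of them come from your post), and it assumes you have Azure credentials stored in DataRobot for the batch prediction job to use:

import datarobot as dr

# Placeholders -- substitute your own values.
STORAGE_ACCOUNT = "mystorageaccount"
CONTAINER = "scoring"
deploymentId = "<your-deployment-id>"
credential_id = "<datarobot-azure-credential-id>"  # Azure credentials stored in DataRobot

dr.Client(endpoint="https://app.datarobot.com/api/v2", token="<your-api-token>")

# 1) Mount the blob container as DBFS (run once; dbutils is provided by the notebook).
dbutils.fs.mount(
    source=f"wasbs://{CONTAINER}@{STORAGE_ACCOUNT}.blob.core.windows.net",
    mount_point="/mnt/scoring",
    extra_configs={
        f"fs.azure.account.key.{STORAGE_ACCOUNT}.blob.core.windows.net":
            dbutils.secrets.get(scope="my-scope", key="storage-key")
    },
)

# 2) Write the request data to the mount, e.g. from a Spark DataFrame:
# spark_df.write.csv("/mnt/scoring/pathToRequest.csv", header=True)

# 3) Score straight from the blob, so the client never touches a local path.
job = dr.BatchPredictionJob.score(
    deploymentId,
    intake_settings={
        "type": "azure",
        "url": f"https://{STORAGE_ACCOUNT}.blob.core.windows.net/{CONTAINER}/pathToRequest.csv",
        "credential_id": credential_id,
    },
    output_settings={
        "type": "azure",
        "url": f"https://{STORAGE_ACCOUNT}.blob.core.windows.net/{CONTAINER}/pathToResponse.csv",
        "credential_id": credential_id,
    },
)
job.wait_for_completion()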

 

Alternatively, you could call the DataRobot API to score data on DBFS, as shown in this Databricks example notebook.
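
For that second approach, a minimal sketch (not the linked notebook verbatim): on Databricks clusters with the FUSE mount enabled, DBFS is exposed at /dbfs on the driver, so the client can read and write it as an ordinary local path. That is likely also why your snippet failed: score_to_file only understands local paths, file objects, or pandas DataFrames, not dbfs:/ URIs. Paths below are placeholders, and a configured dr.Client is assumed:

import datarobot as dr

# Use the FUSE path (/dbfs/...) rather than the dbfs:/ URI, since the
# DataRobot client treats intake_path/output_path as local file paths.
job = dr.BatchPredictionJob.score_to_file(
    deploymentId,
    intake_path="/dbfs/pathToRequest.csv",
    output_path="/dbfs/pathToResponse.csv",
    passthrough_columns_set="all",
)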

 

If you have follow-up questions, don't hesitate to ask.