Power BI Web Service Refresh Error When Trying to Refresh an On-Premises 32-bit Database/Source
I recently worked with a client who is still running an old 32-bit Oracle instance with lots of dependencies; upgrading is not currently an option. Her Power BI report/model queries several tables in the Oracle database. Unfortunately, due to the 32-bit architecture she was unable to schedule her refreshes, even using the on-premises data gateway installed on her desktop: the gateway is a 64-bit application, so it cannot load the 32-bit Oracle client libraries.
The Power BI refresh error she received was: “Failed to update data source credentials: An error happened while reading data from the provider: ‘Attempt to load Oracle client libraries threw BadImageFormatException. This problem will occur when running in 64 bit mode with the 32 bit Oracle client components installed.’”
As a result, she could only refresh the report in Power BI Desktop and then manually publish it to the cloud. This was cumbersome: the data refresh took quite some time, and she needed it done daily. I came up with two solutions for her, both of which we implemented and tried. One proved to be better for us, and she is now successfully using the first solution/workaround covered below.
Solution #1 – Export Query Results to CSV and Post on SharePoint
- Just as the section title states, the workaround was to automate the generation of CSV files that Power Query would ultimately consume.
- We did this with a combination of scripts pulling data via SQL, some in VBA (Access, Excel) and some in Python. You can pull the data any way you are most comfortable (macros, etc.); see the sketch after this list for one way it can look in Python.
- The query results are then written out to CSV files, which land directly in SharePoint folders.
- Once this was set up and the initial files were in place, we changed the data source in Power Query to navigate to the SharePoint folders and process the CSV files.
- It is important to note that you are not required to use CSV files. They made sense in our case because of the amount of data involved (several very large files generated nightly). You can write to any file format Power Query supports, as long as you can place it on SharePoint.
- Also, as noted, we used a separate folder for each unique query. If you are replacing several queries, you can do the same (write to separate folders) or, if you would prefer a single folder, use distinct file-naming patterns so each query's files can be filtered.
- Everything is kicked off by a command-line batch script (.bat) that Windows Task Scheduler runs overnight on her desktop. We also set up the same script on a colleague’s PC for redundancy in case something happens to hers, but it is not actively running.
- After refreshing and publishing the updated report with the new SharePoint sources, I went into the dataset settings in the Power BI web service, added the SharePoint credentials, saved, and kicked off a refresh in the browser.
- The refresh succeeded, so I set a daily schedule for the web service to refresh automatically. No intervention has been needed since implementation!
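For reference, below is a minimal sketch of the kind of export script this solution relies on. It is not our production code: the host, credentials, query names, and SharePoint path are all placeholders. It also assumes a 32-bit Python install with the cx_Oracle package (so the existing 32-bit Oracle client can be loaded) and a SharePoint document library synced locally through OneDrive, so that writing to a local folder lands the files in SharePoint.

```python
"""Nightly export of Oracle query results to CSV files on SharePoint.

A minimal sketch, not production code. Assumptions (all placeholders):
  - 32-bit Python + cx_Oracle, matching the 32-bit Oracle client
  - SP_ROOT is a SharePoint library synced locally via OneDrive
  - connection details and queries are illustrative only
"""
import csv
import datetime
from pathlib import Path

import cx_Oracle  # pip install cx_Oracle (32-bit build to match the client)

DSN = cx_Oracle.makedsn("ora-host.example.com", 1521, service_name="ORCL")

# One output folder per query, as described above. Names are hypothetical.
QUERIES = {
    "orders": "SELECT * FROM orders",
    "customers": "SELECT * FROM customers",
}
SP_ROOT = Path(r"C:\Users\me\CompanyName\Reporting - Documents")


def export_query(conn, name: str, sql: str) -> None:
    """Run one query and write its results to a date-stamped CSV."""
    out_dir = SP_ROOT / name
    out_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.date.today().strftime("%Y%m%d")
    out_file = out_dir / f"{name}_{stamp}.csv"

    with conn.cursor() as cur:
        cur.execute(sql)
        headers = [col[0] for col in cur.description]
        with out_file.open("w", newline="", encoding="utf-8") as f:
            writer = csv.writer(f)
            writer.writerow(headers)
            # Stream rows so very large result sets are not held in memory.
            for row in cur:
                writer.writerow(row)


if __name__ == "__main__":
    with cx_Oracle.connect(user="report_user", password="***", dsn=DSN) as conn:
        for name, sql in QUERIES.items():
            export_query(conn, name, sql)
```

The .bat file that Task Scheduler launches can then be a single line such as `python export_queries.py` (the script name here is hypothetical). Writing date-stamped files like this also gives you the naming pattern to filter on if you consolidate everything into a single folder.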
Solution #2 – Automate the Power BI Desktop Refresh Process Using Python
- This method is much less convenient in my opinion: after we implemented it, it failed often and required manual intervention several times before we moved on to Solution #1.
- Another caveat is that this solution requires at least some familiarity with Python (or access to someone with Python knowledge to tap as a support resource).
- Someone also needs to install Python and the required package, and then set up the automation to run the scripts.
- Important note – it is possible to install a portable version of Python if you do not have admin rights on the desktop.
- The package is called pbixrefresher (see the pbixrefresher-python repository on GitHub: https://github.com/dubravcik/pbixrefresher-python). A hedged sketch of scheduling it with retries follows this list.
- This gentleman does a great job of walking through the setup in his YouTube video: https://youtu.be/wtXXpicCUlY
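For completeness, here is the flavor of scheduled wrapper you could put around pbixrefresher, with the retry logic our experience suggests you will want. This is a sketch under assumptions: I am assuming the CLI is invoked as `pbixrefresher <path-to-pbix>` (verify the exact usage against the repo's README), and the report path is a placeholder.

```python
"""Scheduled wrapper around pbixrefresher, with basic retries and logging.

A sketch under assumptions: the pbixrefresher CLI is assumed to be
invoked as `pbixrefresher <path-to-pbix>` (check the repo's README),
and the .pbix path below is a placeholder, not a real report.
"""
import logging
import subprocess
import time

PBIX_PATH = r"C:\Reports\DailySales.pbix"  # placeholder path
MAX_ATTEMPTS = 3

logging.basicConfig(
    filename="refresh.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)


def refresh(pbix: str) -> bool:
    """Invoke the pbixrefresher CLI; return True on a clean exit."""
    try:
        result = subprocess.run(
            ["pbixrefresher", pbix],
            capture_output=True,
            text=True,
            timeout=60 * 60,  # allow up to an hour for a slow refresh
        )
    except subprocess.TimeoutExpired:
        logging.error("Refresh timed out")
        return False
    if result.returncode == 0:
        logging.info("Refresh succeeded")
        return True
    logging.error("Refresh failed: %s", result.stderr.strip())
    return False


if __name__ == "__main__":
    for attempt in range(1, MAX_ATTEMPTS + 1):
        if refresh(PBIX_PATH):
            break
        logging.warning("Attempt %d of %d failed; retrying", attempt, MAX_ATTEMPTS)
        time.sleep(300)  # wait five minutes before retrying
```

Even with retries, keep in mind that (as I understand it) the package works by automating the Power BI Desktop UI, which is part of why this route stays more fragile than the CSV/SharePoint approach.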
I hope this post has been helpful and has given you some ideas on how to proceed if you or an organization you support is facing this same situation with older data sources. I’d also love to hear about other solutions if you’ve faced the same issue and handled it differently.
Please reach out with any questions or if you would like to see further coverage of any of the concepts/steps mentioned in this post. I am also available on a limited basis for consulting, as my schedule has been full of late (a good thing!). Just fill out the contact form and I will get back to you.