During budgeting season, we help you control costs: query cost estimation, column details, Snowflake and more!
November has started. 2 months left before the end of the year. Far or close? I'll let you decide.
For us, we feel it's both and we will try to make it count.
That means 4 dev cycles/sprints/iterations (call them whatever you like), but also multiple releases with killer features. More on that later (later as in at the bottom of this update 😉)
And for most of you, the end of the year is budgeting time.
The context being what it is, we're providing you with tools to improve your visibility on costs.
Google bills BigQuery usage based on the number of bytes processed by each query. Current pricing is $6.50 per TB, with the first TB free each month.
From now on, all BigQuery data query cells will display the estimated bytes processed by the query before you run it.
It helps you make sure you don't SELECT * when you don't need to. As a reminder, a columnar data warehouse scans data... column by column 🤓. Meaning that SELECT * costs more than a leaner SELECT id.
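To make the arithmetic concrete, here is a minimal sketch of how an estimated cost could be derived from bytes processed, using the pricing quoted above ($6.50 per TB, first TB free each month). The function name and the free-tier accounting are illustrative assumptions, not Husprey's or Google's actual implementation.

```python
TB = 1024 ** 4          # assuming binary terabytes, as BigQuery billing uses
PRICE_PER_TB = 6.50     # on-demand price quoted above, in USD
FREE_TB_PER_MONTH = 1   # first TB each month is free

def estimated_query_cost(bytes_processed: int, bytes_already_billed_this_month: int = 0) -> float:
    """Estimate the USD cost of a query, accounting for the monthly free tier."""
    free = FREE_TB_PER_MONTH * TB
    total = bytes_already_billed_this_month + bytes_processed
    # Bytes billable after this query, minus bytes already billable before it
    billable = max(0, total - free) - max(0, bytes_already_billed_this_month - free)
    return billable / TB * PRICE_PER_TB
```

For example, a query scanning 2 TB at the start of the month would cost an estimated $6.50: the first TB is free, the second is billed.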
But sometimes that's not enough.
Admins can now set a warning for expensive queries. What does that mean? If the estimated bytes processed exceed a defined threshold, a warning is displayed to the query writer before the query runs!
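The check described above boils down to a simple comparison. This is a hypothetical sketch (names and structure are ours, not Husprey's code): no warning when no threshold is configured, a warning only when the estimate strictly exceeds it.

```python
from typing import Optional

GB = 1024 ** 3

def should_warn(estimated_bytes: int, threshold_bytes: Optional[int]) -> bool:
    """Return True when the query writer should see a cost warning."""
    if threshold_bytes is None:
        # Admin has not configured a threshold: never warn
        return False
    return estimated_bytes > threshold_bytes
```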
🛣️ Want this feature to be available for another data warehouse? Reply to this message to bump it up on the roadmap!
🗒️ And if you need more cost control, we can send you (nice-looking Husprey) notebooks with more information about your usage. Just get in touch with us in our Chat.
Did you know you could display detailed information about your columns? You knew, right? Right? If you didn't, it's available by clicking on a column in the Documentation Panel.
In fact, we heard great feedback about this (which makes our team happy btw).
But (there is often a but), some teams prefer not to have this feature, to limit data warehouse load and costs. Okay, fine.
As an Admin, right from your Settings, you can now deactivate this feature if you feel this is not for you.
N.B.: As much as possible, we use the data warehouses' built-in functions to compute those details, to keep the induced costs down.
Snowflake is not known for being easy on pricing. Yet, you can configure the warehouse size you want to use and therefore control your costs.
In Husprey, you have two ways to control costs.
First, you can ask for "Personal Credentials" data sources. You knew about Personal Credentials, right? Right? Read more on personal credentials.
In Husprey, you can let any analysts query the warehouse using their own credentials. This is an advanced feature that we activate in your workspace on demand.
When running a query in Snowflake, we use the current user's default warehouse, letting you switch warehouses on the go, directly from your Snowflake settings.
But admins can do more than that.
At any time, you can head to your Settings, edit your Snowflake connection, and pick a warehouse. It will then be used by default.
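The two mechanisms above can be read as a simple resolution rule: with Personal Credentials, the user's own default warehouse applies; otherwise, the warehouse the admin picked on the connection does. This is our hedged interpretation in a short sketch; all names are illustrative, not Husprey's actual code.

```python
from typing import Optional

def resolve_warehouse(connection_default: Optional[str],
                      user_default: Optional[str],
                      personal_credentials: bool) -> Optional[str]:
    """Return the Snowflake warehouse a query would run on."""
    if personal_credentials and user_default:
        # Personal Credentials: honor the user's own default warehouse
        return user_default
    # Otherwise fall back to the warehouse set by the admin on the connection
    return connection_default
```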
In fact, apart from those highlighted features, we ship continuously, and we will get better at communicating those releases to you on a regular basis.
Like New Year resolutions but in November 🤓
In October, we released many performance fixes, new chart types (Horizontal Bar, Scatter Plot), the ability to refresh your Data Model on a regular basis, new illustrations, design fixes, and more.
And we started working on the future 🤯🚀
Before the end of the year, expect us to release multiple great improvements and new features.
No public sneak peek (yet). But if you want early access, ping us. We would be glad to let you into this closed club 🤩
If you haven't signed up to Husprey yet, start your two-week free trial now!