From S3 to Snowflake and performance


By : Not
Date : September 16 2020, 04:00 AM
Snowflake has some documentation that should answer this pretty well here. In short: ideally your files are big, but not so big or so complicated that they take more than a minute to process.
I have some Snowpipes that handle a lot of small files without much trouble, but you'll probably get at least slightly better performance out of larger files. In general, Snowflake is optimized for larger batches.
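For illustration, a minimal Snowpipe over an S3 stage might look like the sketch below. The stage, pipe, table, bucket, and credentials are all hypothetical placeholders, and Snowflake's own guidance suggests aiming for staged files of roughly 100-250 MB compressed.
code :
-- Hypothetical external stage pointing at the S3 landing bucket
CREATE OR REPLACE STAGE my_s3_stage
  URL = 's3://my-bucket/landing/'
  CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>');

-- Pipe that copies newly staged files into the target table.
-- AUTO_INGEST = TRUE relies on S3 event notifications being set up;
-- batching files toward ~100-250 MB compressed before they land
-- tends to perform better than streaming many tiny files.
CREATE OR REPLACE PIPE my_pipe AUTO_INGEST = TRUE AS
  COPY INTO my_table
  FROM @my_s3_stage
  FILE_FORMAT = (TYPE = 'CSV');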

Snowflake - Performance when column size is not specified


By : Koo Boon Yao
Date : March 29 2020, 07:55 AM
Actually, we're just about to add more info about this to our docs, coming very soon.
In short, the length for VARCHAR and the precision ("15" in DECIMAL(15,2)) for DECIMAL/NUMBER work only as constraints and have no effect on performance. Snowflake automatically detects the range of values and optimizes storage and processing for it. The scale ("2" in DECIMAL(15,2)) for NUMBER and TIMESTAMP can influence storage size and performance, though.
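A quick sketch of what that means in DDL; the table and column names are made up for illustration.
code :
-- The declared VARCHAR length and NUMBER precision act only as
-- constraints; they do not change how Snowflake stores or scans the data.
CREATE OR REPLACE TABLE sizing_demo (
  short_name VARCHAR(50),   -- rejects strings longer than 50 characters
  any_name   VARCHAR,       -- defaults to maximum length, same performance
  amount     NUMBER(15,2),  -- the scale (2) can affect storage/processing
  raw_count  NUMBER         -- defaults to NUMBER(38,0)
);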

traditional star schema vs wide-table performance comparison in Snowflake


By : user3463003
Date : March 29 2020, 07:55 AM
The answer depends on your specific situation to some degree. When designing the schema, you typically have to balance the ease, speed, and recoverability of ingesting data from many different sources/tables against a model that is easy for consumers to understand (e.g., to write complex analytical queries against) and that performs well under load.
I've found that maintaining the core data model in star/snowflake format enables independent ingest, transformation, and conforming of all the corresponding fact and dimension tables; a rough comparison of the two query shapes follows below.
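As a sketch of the trade-off (all table and column names are hypothetical), the same question asked of a star schema and of a wide table looks like this:
code :
-- Star schema: the fact table is joined to conformed dimensions at
-- query time; each table can be loaded and reprocessed independently.
SELECT d.calendar_month, c.customer_segment, SUM(f.sales_amount)
FROM fact_sales f
JOIN dim_date d     ON f.date_key = d.date_key
JOIN dim_customer c ON f.customer_key = c.customer_key
GROUP BY 1, 2;

-- Wide table: the same attributes pre-joined into one denormalized
-- table; simpler for consumers, but the whole table must be rebuilt
-- when any source changes.
SELECT calendar_month, customer_segment, SUM(sales_amount)
FROM sales_wide
GROUP BY 1, 2;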

How to migrate data from one snowflake instance to another, best performance option?


By : user3512412
Date : March 29 2020, 07:55 AM
You could always replicate the data from USWEST to USEAST. That's probably the easiest way, especially if you are only concerned with the data.
https://docs.snowflake.net/manuals/user-guide/database-replication-failover.html
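A sketch of database replication between the two accounts; the database name and account identifiers below are hypothetical placeholders.
code :
-- On the source (US West) account: allow my_db to replicate
-- to the target account.
ALTER DATABASE my_db ENABLE REPLICATION TO ACCOUNTS myorg.useast_account;

-- On the target (US East) account: create a local replica,
-- then refresh it to pull the data across.
CREATE DATABASE my_db AS REPLICA OF myorg.uswest_account.my_db;
ALTER DATABASE my_db REFRESH;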

In Snowflake, does resizing an existing warehouse help improve the performance of a running query?


By : Mitarohmar
Date : March 29 2020, 07:55 AM
Resizing a running warehouse does not impact queries that are already being processed by the warehouse; the additional servers are only used for queued and new queries.
https://docs.snowflake.net/manuals/user-guide/warehouses-considerations.html#scaling-up-vs-scaling-out
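So scaling up mid-flight looks like this (warehouse name is hypothetical); the statement returns immediately, and only queued and newly submitted queries see the larger size.
code :
-- Already-running queries keep executing on the old servers;
-- queued and new queries run on the LARGE configuration.
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'LARGE';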

Why these 2 similar queries in Snowflake have very different performance?


By : Aleks Gorbenko
Date : August 20 2020, 12:56 PM
In the second query you can see that bytes spilled to local storage is 272 GB. This means the work done during processing was too large to fit in the warehouse's memory and had to spill to locally attached SSD. From a performance perspective this is a costly operation, and it is probably why the second query took so long to run (query 1 had only 2 GB of spilling). The easiest solution is to increase the size of the virtual warehouse, or you could rewrite the query:
https://docs.snowflake.net/manuals/user-guide/ui-query-profile.html#queries-too-large-to-fit-in-memory
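To spot this pattern yourself, the spill counters are exposed in the account usage query history view; the warehouse name below is a hypothetical placeholder.
code :
-- Find recent queries that spilled; large values here usually mean
-- the warehouse is undersized for the workload. (This view can lag
-- live activity by up to about 45 minutes.)
SELECT query_id,
       bytes_spilled_to_local_storage,
       bytes_spilled_to_remote_storage
FROM snowflake.account_usage.query_history
ORDER BY start_time DESC
LIMIT 10;

-- Then scale the warehouse up so the working set fits in memory.
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'XLARGE';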