
I want to generate the TPC-DS data sets (1 TB and 10 TB) directly in AWS S3, without transferring them from a local machine to S3. What is the easiest way to do that?
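One low-friction approach is to run dsdgen on an EC2 instance in the same region and stream each table straight into S3, so nothing is staged locally. The sketch below assumes the dsdgen binary from the TPC-DS kit is built in the current directory, that its FILTER option (which sends generated rows to stdout) behaves as documented, and uses a placeholder bucket name:

```shell
#!/bin/sh
# Sketch: generate TPC-DS data on EC2 and pipe it into S3 without touching local disk.
# Assumptions: dsdgen is built in the current directory, the AWS CLI is configured,
# and s3://my-tpcds-bucket is a placeholder bucket you own.
SCALE=1000                               # 1000 = 1 TB; use 10000 for 10 TB
BUCKET=s3://my-tpcds-bucket/sf${SCALE}

for TABLE in call_center catalog_sales store_sales web_sales; do
  # -FILTER Y writes the generated rows to stdout instead of a .dat file;
  # "aws s3 cp -" reads the object body from stdin.
  ./dsdgen -SCALE ${SCALE} -TABLE ${TABLE} -FILTER Y -QUIET Y \
    | aws s3 cp - ${BUCKET}/${TABLE}/${TABLE}.dat
done
```

At these scale factors the fact tables dominate, so dsdgen's PARALLEL/CHILD options can split a single table's generation across several instances running the same loop.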

For a POC, we need 10 GB of data to be available in an Oracle RDS instance. Any test data is fine (for example, TPC benchmark data). Is there a recommended way to create the database and load the sample data for this requirement?
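Since RDS gives no shell access to the database host, one common pattern is to generate flat files with dsdgen on a client machine (or an EC2 instance), create the schema from the DDL shipped with the TPC-DS kit, and bulk-load with SQL*Loader. A minimal sketch, where the connect string and control files are placeholders you would supply:

```shell
#!/bin/sh
# Sketch: load dsdgen flat files into an Oracle RDS instance with SQL*Loader.
# Assumptions: the schema was already created from the kit's DDL, dsdgen is built
# in the current directory, and the connect string below is a placeholder.
CONNECT='tpcds/secret@myrds.example.com:1521/ORCL'

./dsdgen -SCALE 10 -DIR ./data           # scale factor 10 ~= 10 GB of raw data

for F in ./data/*.dat; do
  TABLE=$(basename "$F" .dat)
  # One control file per table maps the pipe-delimited columns to the table;
  # these must be written to match the kit's schema.
  sqlldr "$CONNECT" control="${TABLE}.ctl" data="$F" direct=true
done
```

Direct-path loading (`direct=true`) is the usual choice for bulk benchmark data since it bypasses conventional INSERT processing.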

I have downloaded the DSGEN tool from the TPC-DS website and have already generated the tables and loaded the data into Oracle XE.

I am using the following command to generate the SQL statements:

dsqgen -input ..\query_templates\templates.lst -directory ..\query_templates -dialect oracle -scale 1

However, no matter how I adjust the command, I always get this error message:

ERROR: A query template list must be supplied using the INPUT option

Can anybody help?
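For comparison, a fuller dsqgen invocation along the lines of the options documented in the TPC-DS kit looks like the sketch below; the relative paths and output directory are assumptions to adjust for where the kit was unpacked. If the same error persists, it is worth verifying that templates.lst actually resolves at the path given, since the message can appear when the list file cannot be opened at all:

```shell
# Sketch: generate Oracle-dialect queries from the TPC-DS templates.
# Run from the kit's tools directory; paths are placeholders.
dsqgen \
  -DIRECTORY ../query_templates \
  -INPUT ../query_templates/templates.lst \
  -DIALECT oracle \
  -SCALE 1 \
  -OUTPUT_DIR ../generated_queries
```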

I have developed a cloud storage system that uses the same API structure as Amazon S3. Now I want to run some performance tests on getting object data and object metadata, so that I can compare my system with Amazon S3, OpenStack storage, and other systems.

I have looked at some common file system benchmark tools, but converting them for cloud storage systems would take too much work.

I am looking for benchmark tools similar to SIEGE that can not only issue HTTP requests but also offer workload-simulation features. For example, one simulation could store an entire static HTML website in the cloud storage and then run a workload stress test against it.

Can someone suggest an existing framework or tool that could be adapted relatively easily for such a cloud storage benchmark scenario?
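As a rough baseline before adopting a dedicated tool, the AWS CLI's `--endpoint-url` flag can drive any S3-compatible API, which makes the "static website" workload described above easy to script. A sketch, where the endpoint, bucket, and path-style object URL are all placeholder assumptions about the system under test:

```shell
#!/bin/sh
# Sketch: push a static site into an S3-compatible store and time repeated GETs.
# Assumptions: the target speaks the S3 API at the endpoint below with path-style
# addressing, AWS CLI credentials are configured, and ./site holds the HTML files.
ENDPOINT=http://storage.example.com:9000

aws s3 mb s3://bench --endpoint-url ${ENDPOINT}
aws s3 sync ./site s3://bench --endpoint-url ${ENDPOINT}

# Crude latency probe: 100 sequential anonymous GETs of one object via curl.
for i in $(seq 1 100); do
  curl -s -o /dev/null -w "%{time_total}\n" ${ENDPOINT}/bench/index.html
done | awk '{sum+=$1} END {printf "mean GET latency: %.4fs over %d requests\n", sum/NR, NR}'
```

This only exercises sequential single-object reads; concurrency, object-size mixes, and metadata operations would need a real load generator, but it shows that HTTP-level tools compose naturally against an S3-compatible endpoint.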

Similar Question 4 (1 solution): Benchmarks similar to TPC-H and TPC-DS

Similar Question 5 (1 solution): How is Data Transfer calculated in AWS S3?

Similar Question 6 (5 solutions): How to Generate a Presigned S3 URL via AWS CLI

Similar Question 7 (1 solution): How to generate an AWS S3 Pre-Signed URL