Hello everyone,

I would greatly appreciate your help choosing a database and architecture for a massive-insert workload.

Here are the details:

- there will be 2 tables (I expect timestamp-based partitioning)
- 1st table: 6 columns, 3 of which need to be indexed (timestamp, string, string). 2nd table: 9 columns, with indexing and data types as in the first one.
- expected insert rate: 6 million records per second (total for both tables).
- there will be no updates; reads will be rare (maybe 10-20 simple SELECTs per hour).
- for performance, I expect we will need bulk inserts into unindexed partitions, with each partition indexed afterwards; the indexing should more or less keep up with data insertion (see the sketch after this list).
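Roughly what I have in mind, as a minimal sketch on a single PostgreSQL node, assuming psycopg2; the table and column names (events, ts, src, dst, payload) are placeholders for illustration only:

```python
# "Bulk-load now, index later" sketch: hourly range partitions are created
# without indexes, data goes in via COPY, and indexes are built only once a
# partition is sealed. Names/values here are illustrative assumptions.
import io
from datetime import datetime, timedelta

import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS events (
    ts      timestamptz NOT NULL,
    src     text        NOT NULL,
    dst     text        NOT NULL,
    payload text
) PARTITION BY RANGE (ts);
"""

def create_partition(cur, start: datetime) -> str:
    """Create the next hourly partition with no indexes, so inserts stay cheap."""
    end = start + timedelta(hours=1)
    name = f"events_{start:%Y%m%d_%H}"
    cur.execute(
        f"CREATE TABLE IF NOT EXISTS {name} PARTITION OF events "
        f"FOR VALUES FROM ('{start.isoformat()}') TO ('{end.isoformat()}')"
    )
    return name

def index_sealed_partition(cur, name: str) -> None:
    """Build the three indexes only after the partition stops receiving data."""
    for col in ("ts", "src", "dst"):
        cur.execute(f"CREATE INDEX IF NOT EXISTS {name}_{col}_idx ON {name} ({col})")

def bulk_copy(cur, rows) -> None:
    """Stream a batch of (ts, src, dst, payload) string tuples via COPY."""
    buf = io.StringIO("".join("\t".join(r) + "\n" for r in rows))
    cur.copy_expert("COPY events (ts, src, dst, payload) FROM STDIN", buf)

if __name__ == "__main__":
    with psycopg2.connect(dbname="ingest") as conn, conn.cursor() as cur:
        cur.execute(DDL)
        part = create_partition(cur, datetime(2024, 1, 1, 0))
        bulk_copy(cur, [("2024-01-01 00:00:01+00", "a", "b", "x")])
        index_sealed_partition(cur, part)
```

The point of the sketch is only the ordering: COPY into a partition that has no indexes yet, then index it after it is closed, so index maintenance never sits in the hot insert path.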

The question is: which database and architecture can handle such a load? I assume a cluster and load balancing will be needed, but which DB can achieve this rate? For example, is a PostgreSQL cluster capable of it, or is another solution better?
I know a lot depends on hardware, but first I need to choose the database and architecture.
Some hardware advice would be useful as well.
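To make the "cluster and load balancing" part concrete, here is a purely illustrative sketch of spreading the insert stream over several PostgreSQL nodes with batched COPY (application-side sharding); the host names, routing key, and batch size are made-up assumptions, not something I have tested:

```python
# Fan the stream out across N nodes by hashing a routing key, buffering rows
# per node, and flushing each buffer with one COPY. Everything here is a
# placeholder (hosts, key choice, batch size) to illustrate the idea only.
import io
import zlib

import psycopg2

NODES = ["pg-node-1", "pg-node-2", "pg-node-3"]   # hypothetical shard hosts
BATCH = 50_000                                    # rows buffered per node before each COPY

conns = [psycopg2.connect(host=h, dbname="ingest") for h in NODES]
buffers = [[] for _ in NODES]

def flush(i: int) -> None:
    """Push the buffered rows for node i to its server in one COPY round trip."""
    if not buffers[i]:
        return
    data = io.StringIO("".join("\t".join(r) + "\n" for r in buffers[i]))
    with conns[i].cursor() as cur:
        cur.copy_expert("COPY events (ts, src, dst, payload) FROM STDIN", data)
    conns[i].commit()
    buffers[i].clear()

def ingest(row: tuple) -> None:
    """Route one (ts, src, dst, payload) tuple by its src key; flush full batches."""
    i = zlib.crc32(row[1].encode()) % len(NODES)
    buffers[i].append(row)
    if len(buffers[i]) >= BATCH:
        flush(i)
```

Is something along these lines what a PostgreSQL-based setup would look like at this rate, or would a different system handle the distribution for me?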

Any advice is appreciated.

Best Regards,