database and table design for billions of rows of data

2018-10-19 05:29:46

Basically I have two tables: a header table and a detail table.

CREATE TABLE `header` (
  `ID` int(11) NOT NULL AUTO_INCREMENT,
  `RECORD_DATE` datetime DEFAULT NULL,
  `TICKER_ID` int(11) DEFAULT NULL,
  `CURR_TIMESTAMP` datetime DEFAULT NULL,
  PRIMARY KEY (`ID`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;

CREATE TABLE `detail` (
  `ID` int(11) NOT NULL AUTO_INCREMENT,
  `HEADER_ID` int(11) DEFAULT NULL,
  `BROKER_ID` int(11) DEFAULT NULL,
  `AMOUNT` decimal(26,0) DEFAULT NULL,
  PRIMARY KEY (`ID`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;

There are about 4,800 tickers that need to be updated every day (ticker IDs like A00001, B00032, ...), and each ticker produces a number of records per day, which I store in the detail table.
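To make the relationship concrete, here is a minimal sketch of how one day's data for one ticker lands in the two tables (the literal values are made up, and the numeric TICKER_ID stands in for a symbol like A00001 since the column is an int):

-- one header row per ticker per day
INSERT INTO header (RECORD_DATE, TICKER_ID, CURR_TIMESTAMP)
VALUES ('2018-10-19 00:00:00', 123, NOW());

-- remember the generated header key
SET @header_id = LAST_INSERT_ID();

-- many detail rows per header row, one per broker
INSERT INTO detail (HEADER_ID, BROKER_ID, AMOUNT)
VALUES (@header_id, 1, 1000),
       (@header_id, 2, 2500);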

It worked fine at the beginning, but after a few years the header table has grown to 2.4 million rows and the detail table to 250 million rows, and even a simple select now takes an hour:

SELECT h.ticker_id, h.record_date, d.broker_id, d.amount
FROM detail d
JOIN header h ON h.id = d.header_id;
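For reference, the schema above has no secondary index on detail.HEADER_ID, and InnoDB only clusters rows on the primary key, so a join like this presumably has to scan all 250 million detail rows. A hedged sketch of the indexes such a query would likely need (an assumption based on the schema and the query shape, not a guaranteed fix):

-- join column: without this, every header-to-detail lookup is a full scan
CREATE INDEX idx_detail_header_id ON detail (HEADER_ID);

-- assumed filter pattern on header (by ticker and date)
CREATE INDEX idx_header_ticker_date ON header (TICKER_ID, RECORD_DATE);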