mysql - How to improve wind data SQL query performance


I'm looking for ways to optimize (if possible) the performance of the SQL query used for reading wind information (see below), e.g. by changing the database structure, the query, or something else.

I use a hosted database to store a table with more than 800,000 rows of wind information (speed and direction). New data is added each minute from an anemometer. The database is accessed by a PHP script that creates a web page plotting the data with Google's Visualization API.

The web page takes approximately 15 seconds to load. I've added time measurements to both the PHP and the JavaScript parts to profile the code and find possible areas of improvement.

One part I hope to improve is the following query, which takes approximately 4 seconds to execute. The purpose of the query is to group each 15 minutes of wind speed data (min/max/mean), calculate the mean value, and find the total min/max during each period of measurement.

SELECT  AVG(d_mean) AS group_mean,
        MAX(d_max)  AS group_max,
        MIN(d_min)  AS group_min,
        dir,
        FROM_UNIXTIME(MAX(dt), '%Y-%m-%d %H:%i') AS group_dt
FROM
(
    SELECT  @i:=@i+1,
            FLOOR(@i/15)               AS group_id,
            CAST(mean AS DECIMAL(3,1)) AS d_mean,
            CAST(min  AS DECIMAL(3,1)) AS d_min,
            CAST(max  AS DECIMAL(3,1)) AS d_max,
            dir,
            UNIX_TIMESTAMP(STR_TO_DATE(dt, '%Y-%m-%d %H:%i')) AS dt
    FROM    table, (SELECT @i:=-1) var_init
    ORDER BY id DESC
) t
GROUP BY group_id
LIMIT 0, 360;

...

$oresult = mysql_query($ssql);
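
As a first step (my addition, not part of the original post), the plan of the slow query can be inspected with EXPLAIN: the inner SELECT has to read, cast and sort every one of the 800,000 rows before the outer GROUP BY and LIMIT are applied, which is likely where most of the 4 seconds go. A trimmed sketch, with `table` standing in for the real table name:

    -- Sketch: inspect the plan of a trimmed version of the slow query.
    EXPLAIN
    SELECT  AVG(d_mean) AS group_mean
    FROM
    (
        SELECT  @i:=@i+1,
                FLOOR(@i/15)               AS group_id,
                CAST(mean AS DECIMAL(3,1)) AS d_mean
        FROM    `table`, (SELECT @i:=-1) var_init
        ORDER BY id DESC
    ) t
    GROUP BY group_id
    LIMIT 0, 360;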

The table has the following structure:

1   id      int(11)     auto_increment
2   mean    varchar(5)  utf8_general_ci
3   max     varchar(5)  utf8_general_ci
4   min     varchar(5)  utf8_general_ci
5   dt      varchar(20) utf8_general_ci    // date and time
6   dir     varchar(5)  utf8_general_ci
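
For reference, the DDL implied by that listing is roughly the following (a sketch; the table name wind_data and the NOT NULL constraint are assumptions, the post only says "table"):

    CREATE TABLE wind_data (          -- table name assumed
        id    INT(11) NOT NULL AUTO_INCREMENT,
        mean  VARCHAR(5),             -- wind speed stored as text
        `max` VARCHAR(5),
        `min` VARCHAR(5),
        dt    VARCHAR(20),            -- date and time stored as text
        dir   VARCHAR(5),
        PRIMARY KEY (id)
    ) DEFAULT CHARSET=utf8 COLLATE=utf8_general_ci;

Storing every numeric and date value as VARCHAR is what forces the CASTs and the STR_TO_DATE call in the query above.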

The following setup is used:

  • Database: MariaDB 5.5.42-MariaDB-1~wheezy
  • Database client version: libmysql - 5.1.66
  • PHP version: 5.6
  • PHP extension: mysqli

I agree with the comments so far -- cleanse the data as you put it into the table.

Once you have done the cleansing, let's avoid the subquery by doing...

SELECT  MIN(dt)              AS 'start of 15 mins',
        FORMAT(AVG(mean), 1) AS 'avg wind speed',
        ...
    FROM table
    GROUP BY FLOOR(UNIX_TIMESTAMP(dt) / 900)
    ORDER BY FLOOR(UNIX_TIMESTAMP(dt) / 900);

I don't understand the purpose of the LIMIT. I'll guess that you want only a few days at a time. For that, I recommend adding something like this (after cleansing) between the FROM and the GROUP BY:

    WHERE dt >= '2015-04-10'
      AND dt  < '2015-04-10' + INTERVAL 7 DAY

That would show 7 days, starting the morning of '2015-04-10'.
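
Put together, the post-cleansing query might look like the sketch below (the columns behind the "..." are assumed to be the max and min readings, as in the original query; `table` still stands for the real table name):

    -- Sketch: 15-minute aggregates for one week, after cleansing.
    SELECT  MIN(dt)              AS 'start of 15 mins',
            FORMAT(AVG(mean), 1) AS 'avg wind speed',
            MAX(`max`)           AS 'max wind speed',
            MIN(`min`)           AS 'min wind speed'
        FROM `table`
        WHERE dt >= '2015-04-10'
          AND dt  < '2015-04-10' + INTERVAL 7 DAY
        GROUP BY FLOOR(UNIX_TIMESTAMP(dt) / 900)
        ORDER BY FLOOR(UNIX_TIMESTAMP(dt) / 900);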

In order to handle a table of 800K rows, you decidedly need this (again, after cleansing):

INDEX(dt)
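
For instance (a sketch; `table` again stands in for the real table name):

    ALTER TABLE `table` ADD INDEX (dt);   -- index on the datetime column

With that index and the date range above, roughly 7 days x 1,440 rows per day (about 10,000 rows) are read instead of all 800K.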

To cleanse the 800K rows, there are multiple approaches. I suggest creating a new table, copying the data into it, testing it, and then swapping it over. Something like...

CREATE TABLE new (
    dt datetime,
    mean float,
    ...
    PRIMARY KEY(dt)   -- assuming you have 1 row per minute?
) ENGINE=InnoDB;

INSERT INTO new (dt, mean, ...)
    SELECT  STR_TO_DATE(...),
            mean,   -- I suspect the cast is not needed
            ...;

Write the new SELECT and test it.

By now, new will be missing the newer rows. You can either rebuild it and hope to finish within the 1-minute window before the next row arrives, or play some other game. Let me know if you want to go there.
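
If you do go the swap route, one common pattern (a sketch, assuming the live table really is called `table`) is a single atomic rename:

    RENAME TABLE `table` TO old, new TO `table`;   -- atomic swap; old data kept as `old`

RENAME TABLE is atomic, so readers never see a missing table; any rows inserted between the bulk copy and the rename still have to be copied into the new table separately.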

