Thursday, 28 April 2016

How to optimise handling of big data in Laravel?

Hi everyone! I have a problem. My task is: "Take the transactions table, group the rows by transaction date and calculate the statuses. These manipulations produce statistics, which will be rendered on the page."

This is my method for generating these statistics:

public static function getStatistics(Website $website = null)
    {
        if ($website === null) {
            return [];
        }

        // Load every transaction for the website, newest first.
        $rows = \DB::table('transactions')
            ->where('website_id', $website->id)
            ->orderBy('dt', 'desc')
            ->get();

        // Normalise the dates, then group the rows by transaction date.
        $transactions = collect(static::convertDate($rows))->groupBy('dt');
        $statistics = collect();

        // dd($transactions); // debug dump removed so the method can actually return

        foreach ($transactions as $date => $trans) {
            // Count each status within the day's group.
            $subscriptions   = $trans->where('status', 'subscribe')->count();
            $unsubscriptions = $trans->where('status', 'unsubscribe')->count();
            $prolongations   = $trans->where('status', 'rebilling')->count();
            $redirections    = $trans->where('status', 'redirect_to_lp')->count();
            $conversion      = $redirections == 0 ? 0 : (float) ($subscriptions / $redirections);
            $earnings        = $trans->sum('pay');

            $statistics->push((object) [
                'date'            => $date,
                'subscriptions'   => $subscriptions,
                'unsubscriptions' => $unsubscriptions,
                'prolongations'   => $prolongations,
                'redirections'    => $redirections,
                'conversion'      => round($conversion, 2),
                'earnings'        => $earnings,
            ]);
        }

        return $statistics;
    }

If the transaction row count is below 100,000, everything works fine. But when the count rises to 150-200k, nginx throws a 502 Bad Gateway. What can you advise? I don't have any experience with handling big data. Maybe my implementation has a fundamental error?
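For example, would pushing the grouping and counting into the database be the right direction? Here is a rough sketch of what I mean (not tested, just an assumption on my side; it assumes MySQL, that dt is already a plain date column, and the same column names as in my table):

// Rough sketch (not tested): let MySQL group by date and count statuses,
// so PHP never holds 150-200k rows in memory. SUM(condition) works in
// MySQL because a boolean expression evaluates to 0 or 1.
$statistics = collect(\DB::table('transactions')
        ->selectRaw("
            dt AS date,
            SUM(status = 'subscribe')      AS subscriptions,
            SUM(status = 'unsubscribe')    AS unsubscriptions,
            SUM(status = 'rebilling')      AS prolongations,
            SUM(status = 'redirect_to_lp') AS redirections,
            SUM(pay)                       AS earnings
        ")
        ->where('website_id', $website->id)
        ->groupBy('dt')
        ->orderBy('dt', 'desc')
        ->get()) // collect() in case get() returns a plain array on older Laravel versions
    ->map(function ($row) {
        // Conversion has to be computed per row after aggregation.
        $row->conversion = $row->redirections == 0
            ? 0
            : round($row->subscriptions / $row->redirections, 2);
        return $row;
    });

Or is processing the existing query in pieces with ->chunk() more appropriate for this amount of rows?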

Thanks.



via Chebli Mohamed
