Scripting

Ramblings and links to interesting snippets


Automatic Timestamping in MySQL

Set up a table schema with automatic timestamping

CREATE TABLE `users` (
  `id`              BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
  `full_name`       VARCHAR(100) NOT NULL,
 
  `created_at` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
  `updated_at` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
);

Automatic properties are specified using the DEFAULT CURRENT_TIMESTAMP and ON UPDATE CURRENT_TIMESTAMP clauses in column definitions.

The DEFAULT CURRENT_TIMESTAMP clause assigns the current timestamp as the default value of the column.
The ON UPDATE CURRENT_TIMESTAMP clause updates the column to the current timestamp when the value of any other column in the row is changed from its current value.

CURRENT_TIMESTAMP is a synonym for NOW() and returns the current date and time. It returns a constant time that indicates the time at which a statement began to execute.
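Since everything else on this blog ends up in Laravel sooner or later, here's the same schema as a Laravel migration. This is a minimal sketch assuming Laravel 8+, since useCurrentOnUpdate() isn't available in older versions:

use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

Schema::create('users', function (Blueprint $table) {
    $table->id();
    $table->string('full_name', 100);

    // same automatic timestamping as the raw SQL above
    $table->timestamp('created_at')->useCurrent();
    $table->timestamp('updated_at')->useCurrent()->useCurrentOnUpdate();
});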

 

This was shamelessly stolen :X I really liked their formatting, and just wanted to make a backup.
https://habenamare.com/guides/mysql/automatic-timestamping/


Set up Apache Varnish for a Laravel API

This is how I went about installing and configuring Varnish, assuming you already have Apache set up. Varnish only speaks HTTP, so you first have to set it up to serve on port 80 and then use a reverse proxy so Apache handles the SSL and passes the requests on to Varnish.
So first, you have to move Apache from port 80 to 8080. Essentially you just have to change 80 to 8080 in ports.conf and your VHosts file, but here's a simple command that'll do it for you (note it blindly replaces every 80 in those files, so double-check the result)

sudo sed -i -e 's/80/8080/g' ports.conf
sudo sed -i -e 's/80/8080/g' sites-available/example.com.conf

Then restart Apache

apachectl configtest
sudo /etc/init.d/apache2 restart

Now you’re ready to set up Varnish on port 80. First, install and enable it

sudo apt install -y varnish
sudo systemctl start varnish
sudo systemctl enable varnish

Then adjust Varnish to use port 80, and increase its RAM. I've seen it recommended to give it 75% of your server's total memory, but I set a more conservative 30%.

cd /etc/default/
sudo nano varnish

and change the uncommented

DAEMON_OPTS="-a :6081 \	 	 
     -s malloc,256m"

to

DAEMON_OPTS="-a :80 \	 	 
     -s malloc,5G"

Do essentially the same change here

cd /lib/systemd/system/
sudo nano varnish.service

change this line

ExecStart=/usr/sbin/varnishd -j unix,user=vcache -F -a :6081 -T localhost:6082 -f /etc/varnish/default.vcl -S /etc/varnish/secret -s malloc,256m

Just change 6081 to 80 and 256m to 5G.
Then reload the systemd config and restart Varnish

sudo systemctl daemon-reload
sudo systemctl restart varnish

Now it should be all set on port 80 and caching your HTTP content! You can make sure Varnish is listening with this command

sudo netstat -plntu

Next we've got to make it work over SSL. First, install the necessary Apache modules

sudo a2enmod proxy_balancer
sudo a2enmod proxy_http

In the VHost, comment out the document root, and add this

ProxyPreserveHost On
ProxyPass / http://127.0.0.1:80/
ProxyPassReverse / http://127.0.0.1:80/

This will make Apache pass requests along to Varnish on port 80, and Varnish serving HTTPS traffic is all set!
But to get it to cache the API requests, I had to make a few changes to the Varnish configuration.

sudo nano /etc/varnish/default.vcl

First, add this to set a Hit/Miss header so you can easily debug the configuration.

sub vcl_deliver { # just add this to the existing sub, don't make a new one
    if (obj.hits > 0) { # debug header to see if it's a HIT/MISS, disable when not needed
        set resp.http.X-Cache = "HIT";
    } else {
        set resp.http.X-Cache = "MISS";
    }
    return (deliver);
}

Add this inside of vcl_recv to strip cookies from requests. Varnish by default will not cache requests with cookies; my API uses keys, so this wasn't an issue.

unset req.http.cookie;
unset req.http.Accept-Encoding;

In vcl_backend_response, add this to unset the Laravel session cookie header

unset beresp.http.Set-Cookie;

Another adjustment allows purging individual cached pages via the PURGE method; it works by adding this inside vcl_recv

# allow purging of saved URLs
if (req.method == "PURGE") {
    return (purge);
}

That will allow you to remove any cached page using this command (in production you'll probably want to restrict PURGE with an ACL)

curl -v -k -X PURGE https://url-to-purge
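If you'd rather purge from inside the Laravel app when content changes, the same request can be sent with Laravel's HTTP client. This is just a sketch; the helper name is made up, and withoutVerifying() mirrors curl's -k flag:

use Illuminate\Support\Facades\Http;

// hypothetical helper: call after saving a model to evict its cached page
function purgeFromVarnish(string $url): void
{
    Http::withoutVerifying()->send('PURGE', $url);
}

purgeFromVarnish('https://example.com/api/articles/1');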

And to set up the headers so content is only cached in Varnish, and not in the browser, they need to be:

Cache-Control: s-maxage=60, max-age=0, private, must-revalidate, no-cache, no-store
Surrogate-Control: *

s-maxage sets the max age for shared caches like Varnish only, and Surrogate-Control makes Varnish ignore the no-cache/no-store directives.
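In the Laravel API itself, the easiest way I know to attach those headers everywhere is a small middleware. A minimal sketch, where the class name and the 60 second s-maxage are just my assumptions:

namespace App\Http\Middleware;

use Closure;
use Illuminate\Http\Request;

// hypothetical middleware: let Varnish cache responses for 60s, browsers not at all
class VarnishCacheHeaders
{
    public function handle(Request $request, Closure $next)
    {
        $response = $next($request);

        $response->headers->set('Cache-Control', 's-maxage=60, max-age=0, private, must-revalidate, no-cache, no-store');
        $response->headers->set('Surrogate-Control', '*');

        return $response;
    }
}

Register it in app/Http/Kernel.php like any other middleware.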
And, in case you haven’t done it yet, you can add a cert to your secure site using these commands

sudo certbot --apache -d example.com
sudo /etc/init.d/apache2 restart

If Varnish causes issues with renewing the cert, then try adding this to your crontab

0 12 * * * /usr/bin/certbot renew --cert-name example.com --http-01-port 8080 --quiet

To control the length of time that Varnish caches a page, all you need to do is set its Cache-Control header. It essentially needs to be set to

max-age=<seconds>, public

and Varnish will do the rest.
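For example, in a Laravel route that should stay cached for five minutes (the route and view names are just placeholders):

Route::get('/articles/{id}', function ($id) {
    // public + max-age lets Varnish hold the page for 300 seconds
    return response()
        ->view('articles.show', ['article' => \App\Models\Article::findOrFail($id)])
        ->header('Cache-Control', 'max-age=300, public');
});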

Sources:
https://www.howtoforge.com/how-to-install-and-configure-varnish-with-apache-on-ubuntu-1804/
https://bash-prompt.net/guides/apache-varnish/
https://stackoverflow.com/questions/65013805/cant-purge-entire-domain-in-varnish-but-can-purge-individual-pages-do-i-have
https://stackoverflow.com/questions/48988162/disable-cache-on-browser-without-disabling-it-on-varnish


Site-wide search with Laravel Scout & Sphinx

I needed to set up a simple indexer to increase site performance. After looking at the different options, like Elasticsearch, Meilisearch, TNT Search, etc., I ended up on Sphinx. Sphinx is a little dated, but it's very fast and doesn't have a lot of dependencies, since it's written in C++.

To start I installed Sphinx

sudo apt-get install sphinxsearch

Then updated the config file

cd /etc/sphinxsearch/
sudo nano sphinx.conf

Here’s a very simple full-text search example

source database
{
    type          = mysql

    sql_host      = localhost
    sql_user      = user
    sql_pass      = secret
    sql_db        = database
    sql_port      = 3306 # optional, default is 3306
}

source src_articles : database
{
    sql_field_string = name

    sql_query   = SELECT id, name FROM articles
}

index articles
{
    type        = plain
    source      = src_articles
    path        = /var/lib/sphinxsearch/data/articles

    min_prefix_len = 3

    index_exact_words   = 1
    expand_keywords     = 1

    charset_type = utf-8
}

indexer
{
    mem_limit               = 256M
}

searchd
{
    listen                  = 9312
    listen                  = 9306:mysql41

    # log file, searchd run info is logged here
    # optional, default is 'searchd.log'
    log                     = /var/log/sphinxsearch/searchd.log

    # query log file, all search queries are logged here
    # optional, default is empty (do not log queries)
    query_log               = /var/log/sphinxsearch/query.log

    read_timeout        = 5
    max_children        = 30
    pid_file            = /var/run/sphinxsearch/searchd.pid
    seamless_rotate     = 1
    preopen_indexes     = 1
    unlink_old          = 1
    workers             = threads
    binlog_path         = /var/lib/sphinxsearch/data
    collation_server    = utf8_ci
    thread_stack        = 6144K
}

Then we need to enable Sphinx, build the indexes as the sphinxsearch user, start the daemon, and create a cron job to re-run the indexes once a day at midnight

# in /etc/default/sphinxsearch, set START=yes
sudo nano /etc/default/sphinxsearch

# build the indexes and start the daemon as the sphinxsearch user
sudo su sphinxsearch -s /bin/bash
indexer --all
/etc/init.d/sphinxsearch start

# add the nightly re-index to the sphinxsearch user's crontab
sudo crontab -u sphinxsearch -e
0 0 * * * indexer --all --rotate

Then, install the plugins we need for Laravel

composer require laravel/scout:6.0.0
composer require constantable/laravel-scout-sphinx
php artisan vendor:publish --provider="Laravel\Scout\ScoutServiceProvider"

Then you need to add these to the providers array in config/app.php

Constantable\SphinxScout\ServiceProvider::class,
Laravel\Scout\ScoutServiceProvider::class,

Alright! That should get Sphinx all set on the server, and ready to go in Laravel. In every model you want to be searchable, add this (each of them also needs an index defined in the Sphinx config)

use Laravel\Scout\Searchable;

class Article extends Model
{
    use Searchable;
}
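With that in place, a quick sanity check from php artisan tinker (assuming an Article model like the one above) looks like:

// returns the top 5 Sphinx matches as Eloquent models
App\Models\Article::search('keyword')->take(5)->get();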

Now comes the controller method to intermingle the models :X This is still a WIP; Scout comes with a very limited version of Eloquent attached to it, so you can't set the offset by hand. This is as close as you can get with out-of-the-box Scout.

This supports a JSON version that intermingles 6 results from each source for an autocomplete field, as well as a paginated version.

// assumes these imports at the top of the controller:
// use Illuminate\Http\Request;
// use Illuminate\Pagination\LengthAwarePaginator;
public function search(Request $request) {
        $rsx = $request->header('X-Requested-With');
 
        if (!($page = $request->get('page'))) {
            $page = 1;
        }
 
        $searchResults = [];
 
        if ($q = $request->get('q')) {
            $tables = ['article','video','book'];
 
            if ($rsx != 'XMLHttpRequest') {
                //get totals for pagination
                $total = 0;
                $totals = [];
                foreach ($tables as $k => $table) {
                    $totals[$k] = ('App\Models\\' . str_replace('_', '', ucwords($table, '_')))::search($q)->paginate(1);
                    $totals[$k] = $totals[$k]->total();
                    $total = $total + $totals[$k];
                }
 
                $perPage = 9; // page size: up to 3 results from each of the 3 sources
                $pagination = new LengthAwarePaginator([], $total, $perPage);
                $pagination = str_replace('/?', '?q='.$q.'&', $pagination->render());
 
                $parts = sizeof($totals);
                foreach ($tables as $k => $table) {
                    $offset = ($page - 1) * 3;
                    if(($offset + 3) > $totals[$k]) {
                        $parts = $parts - 1;
                    }
                }
                if($parts==0)$parts=1;
                $perpart = 9 / $parts;
            }
 
            foreach($tables as $k => $table) {
                if ($rsx != 'XMLHttpRequest' && ($offset + 3) > $totals[$k] && $page != 1 && $totals[$k] != 0) {
                    continue;
                }
 
                $model = ('App\Models\\'.str_replace('_','',ucwords($table,'_')));
                $searchResults[$k] = $model::search($q);
                if ($rsx == 'XMLHttpRequest') {
                    $searchResults[$k] = $searchResults[$k]->take(6)->get()->all();
                } else {
                    $searchResults[$k] = $searchResults[$k]->paginate(floor($perpart)+($table == 'article' && is_float($perpart) ? 1 : 0))->all();
                }
 
                $searchResults[$k] = array_map(function ($i) use ($table) {$i->type = $table.'s';return $i;}, $searchResults[$k]);
            }
 
            $searchResults = $this->array_zip_merge($searchResults);
            $searchResults = array_filter($searchResults);
        }
 
        if ($rsx == 'XMLHttpRequest') {
            $rta = [];
            foreach ($searchResults as $s) {
                $rta[] = array(
                    'name' => $s['name'],
                    'date' => $s['updated_at'],
                    'link' => $s['link'],
                );
            }
            return json_encode($rta);
        }
 
        return view('search_results',['pagination'=>$pagination,'q'=>$q, 'searchResults'=>$searchResults]);
}
 
private function array_zip_merge($args) {
        $output = array();
        // The loop incrementer takes each array out of the loop as it gets emptied by array_shift().
        for ($args = $args; count($args); $args = array_filter($args)) {
            // &$arg allows array_shift() to change the original.
            foreach ($args as &$arg) {
                $output[] = array_shift($arg);
            }
        }
        return $output;
}
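For completeness, wiring that method up is just an ordinary route; the controller name and Laravel 8+ route syntax here are assumptions:

// routes/web.php
Route::get('/search', [App\Http\Controllers\SearchController::class, 'search']);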

These resources were really helpful in getting everything set up

https://laravel.com/docs/8.x/scout
https://www.digitalocean.com/community/tutorials/how-to-install-and-configure-sphinx-on-ubuntu-14-04
https://sam-ngu.medium.com/site-wide-search-in-laravel-with-scout-27bbdc75ae8
https://github.com/franksierra/OSMServer


Merge Excel Sheets with Laravel Excel

Laravel Excel allows you to create and format Excel documents with PHP within the Laravel framework. I specifically needed to merge Excel sheets, and not just tack the sheets onto one document, but actually merge several files into a single sheet. It's pretty straightforward, but the documentation leaves a bit to be desired, so it took me a while to narrow down the right way to do this. The only downside is that it removes any formatting from the sheets.

// first pull all the rows
$rows = [];
// capture $rows by reference so the closures can append to it
Excel::batch(storage_path('FOLDER/CONTAINING/EXCEL/FILES'), function($rows1, $file) use (&$rows) {
    $rows1->each(function($row) use (&$rows) {
        $rows[] = $row->toArray();
    });
});

// then add them to a new file
Excel::create('new_filename', function ($excel) use ($rows) {
    $excel->sheet('Sheet', function ($sheet) use ($rows) {
        // append every collected row to the single merged sheet
        foreach ($rows as $row) {
            $sheet->appendRow($row);
        }
    });
})->store('xls', storage_path('excel/exports'));


Compare CSV files in Bash

I’ve been needing this forever!  I have huge CSV lists and have needed a way to quickly compare them.  For this I just need to make sure that the first field from one CSV does not exist anywhere in the second CSV.  It was actually pretty simple to write this in Bash and it runs really fast!

# for each row of list2.csv, check whether its first field appears anywhere in list1.csv
while IFS=, read -r field1 field2
  do
    if grep -qF -- "$field1" list1.csv; then
      echo "$field1 found"
    fi
done < list2.csv
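And since most of my stuff ends up in PHP anyway, here's the same check as a PHP sketch. It keeps the bash semantics of matching the field anywhere in the other file, not just in its first column:

// read list1.csv once, then scan it for each first field from list2.csv
$haystack = file_get_contents('list1.csv');

foreach (array_map('str_getcsv', file('list2.csv', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)) as $row) {
    if ($row[0] !== null && $row[0] !== '' && strpos($haystack, $row[0]) !== false) {
        echo $row[0] . " found\n";
    }
}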
