
Automatic Timestamping in MySQL

Set up a table schema with automatic timestamping

CREATE TABLE `users` (
  `id`              BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
  `full_name`       VARCHAR(100) NOT NULL,
 
  `created_at` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
  `updated_at` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
);

Automatic properties are specified using the DEFAULT CURRENT_TIMESTAMP and ON UPDATE CURRENT_TIMESTAMP clauses in column definitions.

The DEFAULT CURRENT_TIMESTAMP clause assigns the current timestamp as the default value of the column.
The ON UPDATE CURRENT_TIMESTAMP clause updates the column to the current timestamp when the value of any other column in the row is changed from its current value.

CURRENT_TIMESTAMP is a synonym for NOW() and returns the current date and time. It returns a constant time that indicates the time at which a statement began to execute.
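Here's a quick sanity check of that behavior, as a minimal sketch (the credentials and database name are placeholders; it assumes the users table above). The SELECT SLEEP(1) just makes sure the clock moves between the insert and the update:

mysql -u user -p database <<'SQL'
INSERT INTO users (full_name) VALUES ('Jane Doe');
SELECT SLEEP(1); -- let a second pass so the two timestamps can differ
UPDATE users SET full_name = 'Jane Q. Doe' WHERE id = LAST_INSERT_ID();
-- created_at keeps the insert time; updated_at now shows the update time
SELECT created_at, updated_at FROM users WHERE id = LAST_INSERT_ID();
SQL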


This was shamelessly stolen :X  I really liked their formatting, and just wanted to make a backup
https://habenamare.com/guides/mysql/automatic-timestamping/

Set up Apache Varnish for a Laravel API

I needed to cache a Laravel API due to heavy use.  I needed it cached before it touched PHP or MySQL, and the server I was on had to stay on Apache.  I looked at mod_cache too, but that's a lot more suited to static websites, while this was a dynamic API.  So I decided to set up Varnish.  For this tutorial I was using Varnish 6.2.  It stores responses in memory for as long as dictated by the Cache-Control header.

This is how I went about installing and configuring Varnish, assuming you already have Apache set up. Varnish only speaks plain HTTP, so you first set it up on port 80, then put Apache in front as a reverse proxy: Apache handles the SSL and passes the request on to Varnish.

So first, you have to move Apache from port 80 to 8080. Essentially you just have to change 80 to 8080 in ports.conf and your VHost file, but here are a couple of commands that'll do it for you

cd /etc/apache2
sudo sed -i -e 's/80/8080/g' ports.conf
sudo sed -i -e 's/80/8080/g' sites-available/example.com.conf

Then restart Apache

apachectl configtest
sudo /etc/init.d/apache2 restart

Now you’re ready to set up Varnish on port 80.  First, install and enable it

sudo apt install -y varnish
sudo systemctl start varnish
sudo systemctl enable varnish
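Worth a quick check once it's installed: Varnish needs to know where its backend (Apache) lives. The default.vcl that ships with the package should already point at 127.0.0.1:8080, which is exactly where we just moved Apache, but verify it on your install:

grep -A 3 'backend default' /etc/varnish/default.vcl
# expected (stock config):
# backend default {
#     .host = "127.0.0.1";
#     .port = "8080";
# }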

Then adjust Varnish to use port 80, and increase its RAM allocation.  I've seen it recommended to give it 75% of your server's total memory, but I set a more conservative 30%.
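If you want to eyeball that number, here's a rough one-liner (free -g rounds to whole gigabytes) that prints 30% of the machine's total RAM in the form the -s malloc option expects:

free -g | awk '/^Mem:/ {printf "-s malloc,%dG\n", int($2 * 0.30)}'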

cd /etc/default/
sudo nano varnish

change the uncommented block

DAEMON_OPTS="-a :6081 \
     -s malloc,256m"

to

DAEMON_OPTS="-a :80 \
     -s malloc,5G"

Do essentially the same change here

cd /lib/systemd/system/
sudo nano varnish.service

change this line

ExecStart=/usr/sbin/varnishd -j unix,user=vcache -F -a :6081 -T localhost:6082 -f /etc/varnish/default.vcl -S /etc/varnish/secret -s malloc,256m

Just change :6081 to :80 and malloc,256m to malloc,5G
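After those two substitutions, the line should read:

ExecStart=/usr/sbin/varnishd -j unix,user=vcache -F -a :80 -T localhost:6082 -f /etc/varnish/default.vcl -S /etc/varnish/secret -s malloc,5G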

Then reload the systemd config and restart Varnish

sudo systemctl daemon-reload 
sudo systemctl restart varnish

Now it should be all set on port 80 and caching your HTTP content! You can make sure Varnish is running with this command

sudo netstat -plntu
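varnishd should now show up bound to port 80; filtering the output makes it easier to spot (the PID and exact formatting will differ on your box):

sudo netstat -plntu | grep varnishd
# tcp   0   0 0.0.0.0:80   0.0.0.0:*   LISTEN   1234/varnishd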

Next we gotta make it work over SSL. First, install the necessary Apache modules

sudo a2enmod proxy_balancer 
sudo a2enmod proxy_http

In the SSL VHost, comment out the document root, and add this

 ProxyPreserveHost On
 ProxyPass / http://127.0.0.1:80/
 ProxyPassReverse / http://127.0.0.1:80/

This will make Apache pass requests along to Varnish on port 80. And with that, your HTTPS traffic is flowing through Varnish!

But, to get it to cache the API requests, I had to make a few changes to the Varnish configuration.

sudo nano /etc/varnish/default.vcl

First, add this to set a Hit/Miss header so you can easily debug the configuration.

sub vcl_deliver { # just add this to the existing sub, don't make a new one
    if (obj.hits > 0) { # Add debug header to see if it's a HIT/MISS and the number of hits, disable when not needed
      set resp.http.X-Cache = "HIT";
    } else {
      set resp.http.X-Cache = "MISS";
    }
    return (deliver);
  }
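Once the caching tweaks below are in place, this header lets you watch the cache warm up: the first request should report a MISS and an immediate repeat a HIT (with example.com standing in for your actual host):

curl -sk -D - -o /dev/null https://example.com/ | grep -i x-cache
# X-Cache: MISS
curl -sk -D - -o /dev/null https://example.com/ | grep -i x-cache
# X-Cache: HIT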

Add this inside of vcl_recv to ignore any cookies on the request. Varnish by default will not cache requests with cookies. My API used keys, so this wasn't an issue.

unset req.http.cookie; 
unset req.http.Accept-Encoding;

In vcl_backend_response, add this to unset the Laravel session cookie header

unset beresp.http.Set-Cookie;

Another adjustment allows purging individual cached pages via the PURGE method; it works by adding this inside vcl_recv

# allow purging of saved URLs
if (req.method == "PURGE") {
    return (purge);
}

That will allow you to remove any cached page using this command

curl -v -k -X PURGE https://url-to-purge

And to set up the headers so content is cached only in Varnish, and not in the browser, they need to be:

Cache-Control: s-maxage=60, max-age=0, private, must-revalidate, no-cache, no-store
Surrogate-Control: *

s-maxage sets the cache lifetime for shared caches like Varnish only, while max-age=0 and the no-cache/no-store directives keep browsers from storing anything; the presence of a Surrogate-Control header is what makes Varnish ignore those no-cache/no-store directives
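You can confirm the headers are coming through as intended with curl (/api/items here is just a stand-in for one of your endpoints):

curl -sk -D - -o /dev/null https://example.com/api/items | grep -iE 'cache-control|surrogate-control|x-cache'
# Cache-Control: s-maxage=60, max-age=0, private, must-revalidate, no-cache, no-store
# Surrogate-Control: *
# X-Cache: HIT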

And, in case you haven’t done it yet, you can add a cert to your secure site using these commands

sudo certbot --apache -d example.com
sudo /etc/init.d/apache2 restart

To control how long Varnish caches a page, all you need to do is set its Cache-Control header.  That essentially needs to be set to

max-age=<seconds>, public

and Varnish will do the rest (when both are present, s-maxage takes precedence over max-age for Varnish).
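The Age response header is a handy way to watch the TTL in action: it counts up on every hit and drops back to 0 when the object expires and Varnish refetches it (again with a stand-in URL):

curl -sk -D - -o /dev/null https://example.com/api/items | grep -i '^age'
# Age: 12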

And for me that was it!  All good!

Sources:

https://www.howtoforge.com/how-to-install-and-configure-varnish-with-apache-on-ubuntu-1804/
https://bash-prompt.net/guides/apache-varnish/
https://stackoverflow.com/questions/65013805/cant-purge-entire-domain-in-varnish-but-can-purge-individual-pages-do-i-have
https://stackoverflow.com/questions/48988162/disable-cache-on-browser-without-disabling-it-on-varnish

File Monitoring Bash Script

I wrote a very simple bash script to check and report on any PHP file changes in the past 24 hours, and run a simple check for any suspicious files.  It doesn't require any software to be installed, so it can be used on shared hosting with limited shell access.

It simply uses `find` to check if any PHP files have been changed and reports back if they have, and uses Fenrir to check for suspicious files.  Fenrir is a simple IOC scanner that checks files for specific patterns that may indicate those files have been compromised.

The actual script is as follows; you'll just need to swap in your actual folder locations and email address

#!/bin/bash

# check for any PHP files changed in the past 24 hours
CHANGED=$(find /websitedirectory/* -name "*.php" -type f -ctime -1 | head -50)

if [[ -z "${CHANGED}" ]]; then
  echo "nothing has changed"
else
  echo "files changed"
  mail -s "Website files changed" your@email.com <<< "files have been changed: ${CHANGED}"
fi

# run fenrir in the background, then give it time to finish
(cd /file_location/fenrir; ./fenrir.sh /websitedirectory/) &
sleep 20m

# fenrir writes its log as FENRIR_<hostname>_<YYYYMMDD>.log
SYSTEM_NAME=$(uname -n | tr -d "\n")
TS_CONDENSED=$(date +%Y%m%d)

MATCHES=$(grep "match" /file_location/fenrir/FENRIR_${SYSTEM_NAME}_${TS_CONDENSED}.log)

if [[ -z "${MATCHES}" ]]; then
  echo "fenrir found nothing"
else
  echo "fenrir found bad files"
  mail -s "Fenrir found suspicious files" your@email.com <<< "Fenrir found suspicious files: ${MATCHES}"
fi

After you've modified the script as necessary and created the file, you can set it to run daily by adding this to your crontab

0 0 * * * /file_location/site_monitor
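Remember to make the script executable, and give it a dry run before trusting the cron job; on a quiet site you should just see the "nothing has changed" line:

chmod +x /file_location/site_monitor
/file_location/site_monitor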

Jump Cutter

I've been doing some video editing, and one thing that really drags on is editing out all the long pauses.  I did some searching on ways to cut down the time spent on this process and found jump cutter.  It's a simple Python script that will increase the speed of any clips detected to have no sound.

It didn’t come with any installation instructions though, so it took me a while to get it going.  Below are the steps I had to follow to get it running on my instance of Ubuntu.

sudo apt install python3-pip 
sudo apt install ffmpeg
pip3 install image
pip3 install audiotsm
pip3 install scipy

After that, you just need to run the following, where you can change the video speed and your video location

python3 jumpcutter.py --input_file video_file.mp4 --silent_speed 2 --sounded_speed 1 --frame_quality 1 --frame_margin 5

For my videos, the output was giving me really bad contrast issues D:  A looot of googling later, I found what works for me.  I replaced this line

command = "ffmpeg -framerate "+str(frameRate)+" -i "+TEMP_FOLDER+"/newFrame%06d.jpg -i "+TEMP_FOLDER+"/audioNew.wav -strict -2 "+OUTPUT_FILE

with this

command = "ffmpeg -r "+str(frameRate)+" -i "+TEMP_FOLDER+"/newFrame%06d.jpg -i "+TEMP_FOLDER+"/audioNew.wav -strict -2 -crf 19 -vf eq=contrast=1 "+OUTPUT_FILE

That will increase the quality (-crf 19 is higher quality than the x264 default of 23) and pin the contrast to its default value

Site-wide search with Laravel Scout & Sphinx

I needed to set up a simple indexer to increase site performance.  After looking at the different options, like Elasticsearch, Meilisearch, TNT Search, etc., I ended up on Sphinx.  Sphinx is a little dated, but it's very fast and doesn't have a lot of dependencies, since it's written in C++.

To start, I installed Sphinx

sudo apt-get install sphinxsearch

Then updated the config file

cd /etc/sphinxsearch/
sudo nano sphinx.conf

Here’s a very simple full-text search example

source database
{
  type          = mysql
 
  sql_host      = localhost
  sql_user      = user
  sql_pass      = secret
  sql_db        = database
  sql_port      = 3306 # optional, default is 3306
}
 
source src_articles : database
{
    sql_field_string = name
 
    sql_query   = SELECT id, name from articles
}
 
index articles
{
    type        = plain
    source      = src_articles
    path         = /var/lib/sphinxsearch/data/articles
 
    min_prefix_len = 3
 
    index_exact_words   = 1
    expand_keywords     = 1
 
    # charset_type was removed in Sphinx 2.2+ (UTF-8 is now the default)
}
 
indexer
{
        mem_limit               = 256M
}
 
searchd
{
        listen                  = 9312
        listen                  = 9306:mysql41
 
        # log file, searchd run info is logged here
        # optional, default is 'searchd.log'
        log                     = /var/log/sphinxsearch/searchd.log
 
        # query log file, all search queries are logged here
        # optional, default is empty (do not log queries)
        query_log               = /var/log/sphinxsearch/query.log
 
        read_timeout        = 5
        max_children        = 30
        pid_file            = /var/run/sphinxsearch/searchd.pid
        seamless_rotate     = 1
        preopen_indexes     = 1
        unlink_old          = 1
        workers             = threads
        binlog_path         = /var/lib/sphinxsearch/data
        collation_server    = utf8_ci
        thread_stack        = 6144K
}

Then we need to enable Sphinx, build the indexes (make sure to run the indexer as the sphinxsearch user), start the daemon, and create a cron job to re-run the indexer once a day at midnight

# edit /etc/default/sphinxsearch and set START=yes
sudo nano /etc/default/sphinxsearch

sudo su sphinxsearch -s /bin/bash
indexer --all
/etc/init.d/sphinxsearch start

# add this line to the sphinxsearch user's crontab:
sudo crontab -u sphinxsearch -e
0 0 * * * indexer --all --rotate
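Before wiring up Laravel, you can sanity-check the index over the SphinxQL listener on port 9306 with the stock mysql client ('keyword' being whatever term you expect to match):

mysql -h 127.0.0.1 -P 9306 -e "SELECT id, name FROM articles WHERE MATCH('keyword') LIMIT 5;"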

Then, install the plugins we need for Laravel

composer require laravel/scout:6.0.0
composer require constantable/laravel-scout-sphinx
php artisan vendor:publish --provider="Laravel\Scout\ScoutServiceProvider"

Then you need to add these to the providers array in config/app.php

Constantable\SphinxScout\ServiceProvider::class,
Laravel\Scout\ScoutServiceProvider::class,

Alright! That should get Sphinx all set on the server, and ready to go in Laravel. In every model you want to be searchable, add this (each one also needs an index defined in the Sphinx config)

use Laravel\Scout\Searchable; // import at the top of the model file

use Searchable; // trait use, inside the class body

Now comes the controller method to intermingle the models :X This is still a WIP; Scout has a very limited version of Eloquent attached to it, so you can't set the offset by hand. So this is as close as you can get with out-of-the-box Scout.

This supports a JSON version that intermingles 6 results from each source for an autocomplete field, as well as a paginated version.

// assumes: use Illuminate\Pagination\LengthAwarePaginator; at the top of the controller
public function search(Request $request) {
        $rsx = $request->header('X-Requested-With');
 
        if (!($page = $request->get('page'))) {
            $page = 1;
        }
 
        $searchResults = [];
        $pagination = ''; // keep the view happy when there's no query
 
        if ($q = $request->get('q')) {
            $tables = ['article','video','book'];
 
            if ($rsx != 'XMLHttpRequest') {
                //get totals for pagination
                $total = 0;
                $totals = [];
                foreach ($tables as $k => $table) {
                    $totals[$k] = ('App\Models\\' . str_replace('_', '', ucwords($table, '_')))::search($q)->paginate(1);
                    $totals[$k] = $totals[$k]->total();
                    $total = $total + $totals[$k];
                }
 
                $perPage = 9; // 3 result types x 3 items per page
                $pagination = new LengthAwarePaginator([], $total, $perPage);
                $pagination = str_replace('/?', '?q='.$q.'&', $pagination->render());
 
                $parts = sizeof($totals);
                foreach ($tables as $k => $table) {
                    $offset = ($page - 1) * 3;
                    if(($offset + 3) > $totals[$k]) {
                        $parts = $parts - 1;
                    }
                }
                if($parts==0)$parts=1;
                $perpart = 9 / $parts;
            }
 
            foreach($tables as $k => $table) {
                if ($rsx != 'XMLHttpRequest' && ($offset + 3) > $totals[$k] && $page != 1 && $totals[$k] != 0) {
                    continue;
                }
 
                $model = ('App\Models\\'.str_replace('_','',ucwords($table,'_')));
                $searchResults[$k] = $model::search($q);
                if ($rsx == 'XMLHttpRequest') {
                    $searchResults[$k] = $searchResults[$k]->take(6)->get()->all();
                } else {
                    $searchResults[$k] = $searchResults[$k]->paginate(floor($perpart)+($table == 'article' && is_float($perpart) ? 1 : 0))->all();
                }
 
                $searchResults[$k] = array_map(function ($i) use ($table) {$i->type = $table.'s';return $i;}, $searchResults[$k]);
            }
 
            $searchResults = $this->array_zip_merge($searchResults);
            $searchResults = array_filter($searchResults);
        }
 
        if ($rsx == 'XMLHttpRequest') {
            $rta = [];
            foreach ($searchResults as $s) {
                $rta[] = array(
                    'name' => $s['name'],
                    'date' => $s['updated_at'],
                    'link' => $s['link'],
                );
            }
            return json_encode($rta);
        }
 
        return view('search_results',['pagination'=>$pagination,'q'=>$q, 'searchResults'=>$searchResults]);
}
 
private function array_zip_merge($args) {
        $output = array();
        // The loop incrementer takes each array out of the loop as it gets emptied by array_shift().
        for ($args = $args; count($args); $args = array_filter($args)) {
            // &$arg allows array_shift() to change the original.
            foreach ($args as &$arg) {
                $output[] = array_shift($arg);
            }
        }
        return $output;
}

These resources were really helpful in getting everything set up

https://laravel.com/docs/8.x/scout
https://www.digitalocean.com/community/tutorials/how-to-install-and-configure-sphinx-on-ubuntu-14-04
https://sam-ngu.medium.com/site-wide-search-in-laravel-with-scout-27bbdc75ae8
https://github.com/franksierra/OSMServer