
File Monitoring Bash Script

I wrote a very simple bash script to check and report on any PHP file changes in the past 24 hours, and run a simple check for any suspicious files.  It doesn’t require any software to be installed, so it can be used on shared hosting with limited shell access.

It simply uses `find` to check whether any PHP files have changed and reports back if they have, and it uses Fenrir to check for suspicious files.  Fenrir is a simple IOC scanner that checks files for specific patterns that may indicate those files have been compromised.

The actual script is as follows; you’ll just need to swap in your actual file locations and email address:

#!/bin/bash
 
#check for changed files
CHANGED=$(find /websitedirectory/* -name "*.php" -type f -ctime -1 | head -50)
 
if [[ ${CHANGED} == '' ]]; then
  echo "nothing has changed"
else
  echo "files changed"
  mail -s "Website files changed" your@email.com <<< "files have been changed: ${CHANGED}"
fi
 
#run fenrir
(cd /file_location/fenrir; ./fenrir.sh /websitedirectory/) &
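# fenrir runs in the background; the sleep below gives it time to finish before we read its log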
sleep 20m
 
SYSTEM_NAME=$(uname -n | tr -d "\n")
TS_CONDENSED=$(date +%Y%m%d)
 
MATCHES=$(grep "match" /file_location/fenrir/FENRIR_${SYSTEM_NAME}_${TS_CONDENSED}.log)
 
if [[ ${MATCHES} == '' ]]; then
  echo "fennrir found nothing"
else
  echo "fenrir found bad files"
  mail -s "Fenrir found suspicious files" your@email.com <<< "Fenrir found suspicious files: ${MATCHES}"
fi

After you’ve modified the script as necessary and created the file, you can set it to run daily by adding this to your crontab:

0 0 * * * /file_location/site_monitor
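
Before relying on the cron job, make sure the script is executable (same placeholder path as above):

chmod +x /file_location/site_monitor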

Jump Cutter

I’ve been doing some video editing, and one thing that really drags on is editing out long pauses.  I did some searching on ways to cut down the time spent on this process and found jump cutter.  It’s a simple Python script that will speed up any clips detected to have no sound.

It didn’t come with any installation instructions, though, so it took me a while to get it going.  Below are the steps I had to follow to get it running on my instance of Ubuntu.

sudo apt install python3-pip 
sudo apt install ffmpeg
pip3 install image
pip3 install audiotsm
pip3 install scipy

After that, you just need to run the following, changing the video speeds and your video location as needed:

python3 jumpcutter.py --input_file video_file.mp4 --silent_speed 2 --sounded_speed 1 --frame_quality 1 --frame_margin 5

For my videos, the output was giving me really bad contrast issues.  A lot of googling later, I found what works for me.  I replaced this line

command"ffmpeg -framerate "+str(frameRate)+" -i "+TEMP_FOLDER+"/newFrame%06d.jpg -i "+TEMP_FOLDER+"/audioNew.wav -strict -2 "+OUTPUT_FILE

with this

command = "ffmpeg -r "+str(frameRate)+" -i "+TEMP_FOLDER+"/newFrame%06d.jpg -i "+TEMP_FOLDER+"/audioNew.wav -strict -2 -crf 19 -vf eq=contrast=1 "+OUTPUT_FILE

That will increase the quality (-crf 19) and set the contrast to its default value (eq=contrast=1).
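
For reference, this is the standalone command that line builds, shown with a placeholder frame rate, temp folder, and output name:

ffmpeg -r 30 -i TEMP/newFrame%06d.jpg -i TEMP/audioNew.wav -strict -2 -crf 19 -vf eq=contrast=1 output.mp4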

Site-wide search with Laravel Scout & Sphinx

I needed to set up a simple indexer to increase site performance.  After looking at the different options, like Elasticsearch, Meilisearch, TNT Search, etc., I ended up on Sphinx.  Sphinx is a little dated, but it’s very fast and doesn’t have a lot of dependencies, since it’s written in C++.

To start, I installed Sphinx:

sudo apt-get install sphinxsearch

Then I updated the config file:

cd /etc/sphinxsearch/
sudo nano sphinx.conf

Here’s a very simple full-text search example:

source database
{
    type        = mysql

    sql_host    = localhost
    sql_user    = user
    sql_pass    = secret
    sql_db      = database
    sql_port    = 3306 # optional, default is 3306
}

source src_articles : database
{
    sql_field_string = name

    sql_query   = SELECT id, name FROM articles
}

index articles
{
    type        = plain
    source      = src_articles
    path        = /var/lib/sphinxsearch/data/articles

    min_prefix_len = 3

    index_exact_words   = 1
    expand_keywords     = 1

    charset_type = utf-8
}

indexer
{
    mem_limit   = 256M
}

searchd
{
    listen      = 9312
    listen      = 9306:mysql41

    # log file; searchd run info is logged here
    # optional, default is 'searchd.log'
    log         = /var/log/sphinxsearch/searchd.log

    # query log file; all search queries are logged here
    # optional, default is empty (do not log queries)
    query_log   = /var/log/sphinxsearch/query.log

    read_timeout        = 5
    max_children        = 30
    pid_file            = /var/run/sphinxsearch/searchd.pid
    seamless_rotate     = 1
    preopen_indexes     = 1
    unlink_old          = 1
    workers             = threads
    binlog_path         = /var/lib/sphinxsearch/data
    collation_server    = utf8_ci
    thread_stack        = 6144K
}

Then we need to enable Sphinx, run the indexer under the sphinxsearch user, start the daemon, and create a cron job to re-run the indexes once a day at midnight:

sudo nano /etc/default/sphinxsearch   # set START=yes
 
sudo su sphinxsearch -s /bin/bash
indexer --all
/etc/init.d/sphinxsearch start
 
sudo crontab -u sphinxsearch -e
0 0 * * * indexer --all --rotate
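
To sanity-check the index, you can query searchd directly over the MySQL-protocol listener from the config above (port 9306).  This assumes the mysql command-line client is installed, and uses a made-up search term:

mysql -h 127.0.0.1 -P 9306 -e "SELECT * FROM articles WHERE MATCH('example');"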

Then install the packages we need for Laravel:

composer require laravel/scout:6.0.0
composer require constantable/laravel-scout-sphinx
php artisan vendor:publish --provider="Laravel\Scout\ScoutServiceProvider"

Then you need to add these to the providers array in config/app.php:

Constantable\SphinxScout\ServiceProvider::class,
Laravel\Scout\ScoutServiceProvider::class,

Alright! That should get Sphinx all set on the server, and ready to go in Laravel. Add this to every model you want to be searchable; each of those models also needs an index defined in the Sphinx config. Here it’s shown on a hypothetical Article model:

use Laravel\Scout\Searchable;

class Article extends Model
{
    use Searchable;
}

Now comes the controller method to intermingle the models. This is still a WIP: Scout comes with a very limited version of Eloquent attached to it, so you can’t set the offset by hand. This is as close as you can get with out-of-the-box Scout.

This supports a JSON version that intermingles 6 results from each source for an autocomplete field, as well as a paginated version.

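// Assumes these imports at the top of the controller:
// use Illuminate\Http\Request;
// use Illuminate\Pagination\LengthAwarePaginator;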
public function search(Request $request) {
        $rsx = $request->header('X-Requested-With');
 
        if (!($page = $request->get('page'))) {
            $page = 1;
        }
 
        $pagination = '';
        $searchResults = [];
 
        if ($q = $request->get('q')) {
            $tables = ['article','video','book'];
 
            if ($rsx != 'XMLHttpRequest') {
                //get totals for pagination
                $total = 0;
                $totals = [];
                foreach ($tables as $k => $table) {
                    $totals[$k] = ('App\Models\\' . str_replace('_', '', ucwords($table, '_')))::search($q)->paginate(1);
                    $totals[$k] = $totals[$k]->total();
                    $total = $total + $totals[$k];
                }
 
                $perPage = 9; // three results from each of the three sources per page
                $pagination = new LengthAwarePaginator([], $total, $perPage);
                $pagination = str_replace('/?', '?q='.$q.'&', $pagination->render());
 
                $parts = sizeof($totals);
                foreach ($tables as $k => $table) {
                    $offset = ($page - 1) * 3;
                    if(($offset + 3) > $totals[$k]) {
                        $parts = $parts - 1;
                    }
                }
                if($parts==0)$parts=1;
                $perpart = 9 / $parts;
            }
 
            foreach($tables as $k => $table) {
                if ($rsx != 'XMLHttpRequest' && ($offset + 3) > $totals[$k] && $page != 1 && $totals[$k] != 0) {
                    continue;
                }
 
                $model = ('App\Models\\'.str_replace('_','',ucwords($table,'_')));
                $searchResults[$k] = $model::search($q);
                if ($rsx == 'XMLHttpRequest') {
                    $searchResults[$k] = $searchResults[$k]->take(6)->get()->all();
                } else {
                    $searchResults[$k] = $searchResults[$k]->paginate(floor($perpart)+($table == 'article' && is_float($perpart) ? 1 : 0))->all();
                }
 
                $searchResults[$k] = array_map(function ($i) use ($table) {$i->type = $table.'s';return $i;}, $searchResults[$k]);
            }
 
            $searchResults = $this->array_zip_merge($searchResults);
            $searchResults = array_filter($searchResults);
        }
 
        if ($rsx == 'XMLHttpRequest') {
            $rta = [];
            foreach ($searchResults as $s) {
                $rta[] = array(
                    'name' => $s['name'],
                    'date' => $s['updated_at'],
                    'link' => $s['link'],
                );
            }
            return json_encode($rta);
        }
 
        return view('search_results',['pagination'=>$pagination,'q'=>$q, 'searchResults'=>$searchResults]);
}
 
private function array_zip_merge($args) {
        $output = array();
        // Take one element from each array per pass; array_filter()
        // drops each array from the rotation once array_shift() empties it.
        while (count($args)) {
            // &$arg allows array_shift() to change the original.
            foreach ($args as &$arg) {
                $output[] = array_shift($arg);
            }
            $args = array_filter($args);
        }
        return $output;
}

These resources were really helpful in getting everything set up:

https://laravel.com/docs/8.x/scout
https://www.digitalocean.com/community/tutorials/how-to-install-and-configure-sphinx-on-ubuntu-14-04
https://sam-ngu.medium.com/site-wide-search-in-laravel-with-scout-27bbdc75ae8
https://github.com/franksierra/OSMServer

Merge Excel Sheets with Laravel Excel

Laravel Excel allows you to create and format Excel documents with PHP within the Laravel framework.  I specifically needed to merge Excel sheets, and not just tack the sheets onto one document, but actually merge several files into a single sheet.  It’s pretty straightforward, but the documentation leaves a bit to be desired, so it took me a while to narrow down the right way to do this.  The only downside is that it will remove any formatting from the sheets.  Note that this uses the Laravel Excel 2.x API; Excel::batch and Excel::create were removed in 3.x.

//first pull all the rows
$rows = [];
Excel::batch(storage_path('FOLDER/CONTAINING/EXCEL/FILES'), function($rows1, $file) use (&$rows) {
    // capture $rows by reference so the closures can append to it
    $rows1->each(function($row) use (&$rows) {
        $rows[] = $row->toArray();
    });
});

//then add them to a new file
Excel::create('new_filename', function ($excel) use ($rows) {
    $excel->sheet('Sheet', function ($sheet) use ($rows) {
        // append every collected row to the single merged sheet
        foreach($rows as $row) {
            $sheet->appendRow($row);
        }
    });
})->store('xls', storage_path('excel/exports'));

Install a Magento extension from SSH

Magento is a popular open-source eCommerce platform with a lot of extensions that add to its basic functionality.  It has a method for installing extensions in its control panel, but in my experience it hasn’t been very reliable, so I usually just install extensions from the shell.  This is my quick cheat sheet on how you can install an extension in the shell (a consolidated version follows the list).

  1. cd into your Magento directory
  2. Get the component name for Composer; in the Magento Marketplace it’s located under “My Profile > My Purchases”
  3. composer require component/name
  4. Run “bin/magento module:status” to get the module’s name; it’ll be under “List of disabled modules:”
  5. bin/magento module:enable ModuleName
  6. bin/magento setup:upgrade
  7. bin/magento setup:di:compile
  8. bin/magento cache:clean
  9. bin/magento setup:static-content:deploy -f
  10. There’s a known issue where Magento often won’t generate a “js-translation.json” file, which will break its back-end, but you can fix it by creating one containing empty JSON:
  11. nano pub/static/adminhtml/Magento/backend/en_US/js-translation.json
  12. []
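
Put together, the whole sequence looks like this.  It’s just a sketch: /var/www/magento, vendor/module-name, and Vendor_ModuleName are placeholders to swap for your own values.

cd /var/www/magento
composer require vendor/module-name
bin/magento module:status                    # the new module shows up under "List of disabled modules:"
bin/magento module:enable Vendor_ModuleName
bin/magento setup:upgrade
bin/magento setup:di:compile
bin/magento cache:clean
bin/magento setup:static-content:deploy -f
echo "[]" > pub/static/adminhtml/Magento/backend/en_US/js-translation.json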

And you’re done!  If everything went off without a hitch, you should see the extension working in your Magento control panel.