RoR


Send Rails Email from console

I’ve needed to do this forever, and finally found a Stack Exchange answer with a nice, quick way to send mail from the Rails console that works with the version of Rails I’m on.

# Configure ActionMailer to use Gmail's SMTP server for this console session
ActionMailer::Base.delivery_method = :smtp
ActionMailer::Base.smtp_settings = {
  address: 'smtp.gmail.com',
  port: 587,
  domain: 'gmail.com',
  authentication: 'plain',
  enable_starttls_auto: true,
  user_name: 'you@gmail.com',
  password: 'yourpassword'
}

# Then send any of your app's mailers directly
MailNotifier.activation_instructions(@user).deliver
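
If you’re on a newer version of Rails (4.2 and up), the deliver call at the end has been superseded by deliver_now, so the last line would instead be:

# On Rails 4.2+ use deliver_now (or deliver_later for background delivery)
MailNotifier.activation_instructions(@user).deliver_now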


Global Replace Content in RoR

I think it’s fairly typical to have to do a global replace on your database at some point or another: updating old links, old code, etc.  And today this was something I was tasked with!  It was pretty straightforward, and these two Stack Overflow questions helped me do it.  To get this working you should just need to follow the commands below, while replacing the following:

  • TABLE – The database table you’re replacing content in
  • COLUMN – The column in that table you’re replacing content in
  • BAD – The old string you want to replace
  • GOOD – The new string you want to replace BAD with
rails c
 
# Run raw SQL that swaps every occurrence of BAD for GOOD in COLUMN
ActiveRecord::Base.connection.execute(%q{
    update TABLE
    set COLUMN = replace(
        COLUMN,
        'BAD',
        'GOOD'
    )
})
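
As a concrete illustration (the table, column, and URLs here are hypothetical, not from my project), updating old links in a posts table might look like:

# Hypothetical example: swap an old domain for a new one in posts.body
ActiveRecord::Base.connection.execute(%q{
    update posts
    set body = replace(
        body,
        'http://old-domain.com',
        'https://new-domain.com'
    )
})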

Checking for a string

You could also use the following code to check whether the string exists in any entries for that model in the first place. Just swap out MODEL for the model, SEARCH for the string you want to check, and COLUMN for the column you want to search. It will output the total number of entries containing that string, then loop through their ID numbers.

rails c
 
# Grab every record whose COLUMN contains SEARCH
# (the trailing ;0 keeps the console from echoing the whole relation)
a = MODEL.where("COLUMN like ?", "%SEARCH%");0
 
# Total number of matching entries
puts a.length
 
puts " "
 
# List the ID of each matching entry
a.each do |record|
  puts record.id
end;0
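
Filled in with the same hypothetical posts table as above, the check might look like:

# Hypothetical example: list every post whose body still mentions the old domain
a = Post.where("body like ?", "%old-domain.com%");0
puts a.length
a.each { |record| puts record.id };0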


Export table in rails console

This is just a small twist on the very nice snippet I found in this post.  I changed it slightly so it pulls all the attributes automatically, which lets you easily swap in any model, and it also lists the column names in the first row.

require 'csv'
 
file = "#{Rails.root}/public/data.csv"
 
# Load every record (the ;0 keeps the console from echoing them all)
table = User.all;0
 
CSV.open( file, 'w' ) do |writer|
  # First row: the column names
  writer << table.first.attributes.map { |a,v| a }
  # Then one row of values per record
  table.each do |s|
    writer << s.attributes.map { |a,v| v }
  end
end
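
Swapping in another model just means changing the table line; for example, with a hypothetical Order model you could export only a scoped subset:

# Hypothetical example: export completed orders instead of all users
table = Order.where(status: "complete");0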


Compress images with simple smusher

I wrote a similar post using TinyPNG about a week ago, but the 500 image limit was a bit small, so I built my own very simple image compressor.  I’ve updated the code to work with my simple smusher, added a few improvements, and put together a PHP version as well.

Ruby Version

require 'open-uri'
require "json"
require "net/http"
require "uri"
require 'rubygems'
 
start = (ARGV[0] ? ARGV[0] : 0).to_i
last = start+100
i = 0
total_savings = 0
Dir.glob("public/system/rich/rich_files/rich_files/**/*.{jpg,png}") do |item|
  #only do 100 images at a time
  break if i == last
 
  # ignore directories and original files
  next if item == '.' or item == '..' or item.include? "original"
  i = i + 1
  next if i <= start
 
  url = item.sub 'public', 'http://www.jimstoppani.com'
  uri = URI.parse("http://yourdomain.com/?i=#{url}")
  http = Net::HTTP.new(uri.host, uri.port)
  http.read_timeout = 500
  request = Net::HTTP::Get.new(uri.request_uri)
  response = http.request(request)
 
  if response.code == "200"
    result = JSON.parse(response.body)
  else
    break
  end
 
  puts "Converted: #{item}"
 
  if result[0]['dest_size'] < result[0]['src_size']
    open(item, 'wb') do |file|
      file << open(result[0]["dest"]).read
    end
    puts "Original: #{result[0]['src_size']} Optimized #{result[0]['dest_size']}"
    total_savings = total_savings + (result[0]['src_size']-result[0]['dest_size'])
  else
    puts "no size difference"
  end
end
 
puts "All done! Total Savings: #{total_savings}"

This one has a few advantages over the last version: it takes a starting offset as an argument, so it doesn’t run into the problem of re-compressing images that have already been compressed.  It processes 100 images at a time, so you can run it in a loop like "ruby tinify.rb 0", "ruby tinify.rb 100", "ruby tinify.rb 200", etc.
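
If you’d rather not start each batch by hand, a small runner script could do it for you; this sketch assumes roughly 500 images, a number you’d adjust to your folder size:

# Hypothetical runner: kick off tinify.rb in batches of 100
(0..500).step(100) do |offset|
  system("ruby", "tinify.rb", offset.to_s)
end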

It also checks the response from the compressor, and if there’s no file size savings it doesn’t copy over the new version. And once it’s done, it will tell you the total savings.

PHP Version

HTML:

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
    <title>Photo Smusher</title>
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
 
    <script>
        $(function(){
        var total = 0;
        var start = 0;
 
 
        function smush_it() {
            var jqxhr = $.getJSON("smush.php?img=FOLDER&p="+start, function(data) {
              total = total + parseFloat(data[0].total);
              start = start+5;
              $("#total").text(total);
              $("#start").text(start);
              if(start < 5010) {
                  smush = setTimeout(function(){smush_it()},60000);
              }
            }).fail(function( jqxhr, textStatus, error ) {
                var err = start + ' ' + textStatus + ", " + error;
                smush = setTimeout(function () {
                    smush_it()
                }, 60000);
                $("#error").text(err);
            });
        }
 
        $("#total").text(total);
        $("#start").text(start);
        smush_it();
 
        });
    </script>
</head>
 
<body>
    <p>Progress: <span id="start"></span></p>
    <p>KB saved: <span id="total"></span></p>
    <p id="error"></p>
</body>
</html>

PHP:

<?php
function loadFile($url) {
    $ch = curl_init();
 
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_URL, $url);
 
    $data = curl_exec($ch);
    curl_close($ch);
 
    return $data;
}
 
 
function downloadfile($file, $path, $fn) {
    if(isset($file) && isset($path) && isset($fn)) {
        $fc = implode('', file($file));
 
        $mode = (file_exists($path . $fn) ? 'w': 'x+');
        $Files = fopen($path . $fn, $mode);
 
        if ((fwrite($Files, $fc)) != 0){
            fclose($Files);
        } else{
            echo 'Error.';
        }
    }
}
 
if(isset($_REQUEST['img']) && isset($_REQUEST['p']) && is_numeric($_REQUEST['p'])) {
   $total = 0;
   $limit = 5;
   $start = $_REQUEST['p'];
   $end = $start + $limit;
   $item = 0;
 
   $handle = opendir('../images/'.$_REQUEST['img']);
   while ($file = readdir($handle)){
      if($item >= $start && $item <= $end) {
         $extension = strtolower(substr(strrchr($file, '.'), 1));
         if($extension == 'jpg'){
            $image = json_decode(loadFile('http://yourdomain.com/?i=http://yourdomain.com/images/'.$_REQUEST['img'].'/'.$file));
 
            $change = $image[0]->src_size-$image[0]->dest_size;
 
            if($change > 1) {
               $total += $change;
               downloadfile($image[0]->dest,'../images/'.$_REQUEST['img'].'/',$file);
            }
         } 
      } elseif($item > $end) {
         break;
      }
      $item++;
   } 
 
   echo '[{"total":"'.$total.'"}]';
}
 
?>

The HTML page goes through the images in the FOLDER you designate, 5 at a time, until it hits image 5010 (the limit in the script), which you can customize based on your folder size. It sends an ajax request to the PHP file, then reports back the total KB saved.  It runs the script once a minute (the 60000 ms timeout), which you can change depending on your server’s performance.

Once the PHP page receives a request it makes sure it has all the required request variables, sets up the counters, then goes through all the images in the designated folder.

The compressor will handle png and gif files too, but I only had jpgs in this project, so I limited it to processing jpgs.  Once it finds a jpg it sends the full URL to your image compression service (you can replace “yourdomain.com” with the URL of your service).  Then it decodes the JSON, and if there was any reduction in file size it downloads the new image over the old one.  And once it’s gone through 5 images it reports the total KB saved back to the HTML page.

 


Simple Smusher

I wrote a post on compressing images with TinyPNG, but that 500 image limit ran out fast!  So I decided to see if I could put together my own image optimization service.

After a little digging I found the image_optim gem.  It’s very nice and straightforward, and the companion image_optim_pack gem installed all the compression binaries as well, making for a very hassle-free set-up.  After getting those installed and setting up my Rails project, it really came down to a simple interface with one controller and one view.

Controller:

require 'open-uri'
require 'image_optim'
require 'fileutils'
 
class WelcomeController < ApplicationController
  def index
    Dir.glob("public/tmp/*").
      select{|f| File.mtime(f) < (Time.now - (60*120)) }.
      each{|f| File.delete(f) }
 
    image = params[:i]
    extension = File.extname(image)
 
    acceptable_extensions = [".jpg", ".gif", ".png"]
 
    return @json = "{\"error\":\"only jpg, png, and gif can be resized\"}" if !acceptable_extensions.include?( extension.downcase )
 
    value = ""; 8.times{value  << (65 + rand(25)).chr}
    tmp_image = "public/tmp/#{value}#{extension}"
 
    open(tmp_image, 'wb') do |file|
      file << open(image).read
    end
 
    before_filesize = (File.size(tmp_image) * 0.001).floor
 
    # combine the options into a single ImageOptim instance
    # (separate .new calls would just overwrite each other)
    image_optim = ImageOptim.new(:pngout => false, :nice => 10)
 
    image_optim.optimize_image!(tmp_image)
 
    after_filesize = (File.size(tmp_image) * 0.001).floor
 
    tmp_image = "http://yourdomain.com/tmp/#{value}#{extension}"
 
    @json = "[{\"dest\":\"#{tmp_image}\",\"src_size\":#{before_filesize},\"dest_size\":#{after_filesize}}]"
  end
 
end

View:

<%= raw @json %>

I’m including open-uri to download the image to be compressed, image_optim to do the compressing, and fileutils for file handling.

  1. I start with the Dir.glob to delete any temporary images that are more than two hours old (the 60*120 seconds in the code).
  2. Then I grab the image from the URL provided in the i param, and make sure it has an acceptable extension
  3. Then I generate a random string, build the temporary filename, and save the image there
  4. Then I optimize the image and save over it
  5. Then I just write out the JSON I want by hand; I wrote it to resemble smush.it’s syntax

You will need to change “yourdomain.com” to your actual domain here; looking back, I should have done that programmatically, but it’s not a big deal.
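
For reference, a client hitting the service from Ruby looks something like this (the image URL here is hypothetical, and yourdomain.com stands in for wherever you deploy it):

require 'json'
require 'net/http'
require 'uri'
 
# Fetch the optimized result and report the savings
uri = URI.parse("http://yourdomain.com/?i=http://example.com/photo.jpg")
result = JSON.parse(Net::HTTP.get(uri))
puts "Saved #{result[0]['src_size'] - result[0]['dest_size']} KB"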

I have the whole project available on GitHub for download.
