Scripting

Ramblings and links to interesting snippets


HTTPS subdomains

This is pretty much just ripped from this article.  But it took me forever to find a solution, so I’m writing this down!

It is possible to secure a subdomain without a separate IP or SSL certificate, but only if you have a wildcard certificate!  Just Host offers these, but what it doesn’t offer is decent support on the matter 😛

When you make a subdomain, say “sub.example.com”, going to http://sub.example.com will work just fine.  But going to https://sub.example.com will load up the contents of your default directory (a.k.a. public_html).  So, in order to get that to work you have to force it to point to the right directory using .htaccess, like so:

RewriteEngine On
# only rewrite HTTPS requests (port 443)
RewriteCond %{SERVER_PORT} ^443$
# ...for the subdomain's hostname
RewriteCond %{HTTP_HOST} ^sub\.example\.com$ [NC]
# ...that aren't already pointing at the subdomain's folder
RewriteCond %{REQUEST_URI} !^/sub-folder/
RewriteRule ^(.*)$ /sub-folder/$1 [L]


Compress images with simple smusher

I wrote a similar post using TinyPNG about a week ago, but the 500-image limit was a bit small, so I built my own very simple image compressor.  I’ve updated the code to work with my simple smusher, added a few improvements, and put together a PHP version as well.

Ruby Version

require 'open-uri'
require "json"
require "net/http"
require "uri"
require 'rubygems'
 
start = (ARGV[0] ? ARGV[0] : 0).to_i
last = start+100
i = 0
total_savings = 0
Dir.glob("public/system/rich/rich_files/rich_files/**/*.{jpg,png}") do |item|
  #only do 100 images at a time
  break if i == last
 
  # ignore directories and original files
  next if item == '.' or item == '..' or item.include? "original"
  i = i + 1
  next if i <= start
 
  url = item.sub 'public', 'http://www.jimstoppani.com'
  uri = URI.parse("http://yourdomain.com/?i=#{url}")
  http = Net::HTTP.new(uri.host, uri.port)
  http.read_timeout = 500
  request = Net::HTTP::Get.new(uri.request_uri)
  response = http.request(request)
 
  if response.code == "200"
    result = JSON.parse(response.body)
  else
    break
  end
 
  puts "Converted: #{item}"
 
  if result[0]['dest_size'] < result[0]['src_size']
    open(item, 'wb') do |file|
      file << open(result[0]["dest"]).read
    end
    puts "Original: #{result[0]['src_size']} Optimized #{result[0]['dest_size']}"
    total_savings = total_savings + (result[0]['src_size']-result[0]['dest_size'])
  else
    puts "no size difference"
  end
end
 
puts "All done! Total Savings: #{total_savings}"

This one has a few advantages over the last version – it needs a variable sent with it, so it doesn’t run into the problem of re-compressing images that have already been compressed.  It does 100 images at a time, so you could run it in a loop like “ruby tinify.rb 0”, “ruby tinify.rb 100”, “ruby tinify.rb 200”, etc.
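
If you want to kick off those successive runs automatically, a rough driver sketch (assuming the script above is saved as tinify.rb, and that you have somewhere around 1,000 images – adjust the upper bound to your own count) could be:

# run_tinify.rb – hypothetical wrapper; bump the upper bound to match your image count
(0..900).step(100) do |offset|
  # run tinify.rb for each batch of 100 images
  system("ruby", "tinify.rb", offset.to_s)
end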

It also checks the response from the compressor, and if there’s no file size savings it doesn’t copy over the new version.  And once it’s done, it will tell you the total savings.

PHP Version

HTML:

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
    <title>Photo Smusher</title>
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
 
    <script>
        $(function(){
        var total = 0;
        var start = 0;
 
 
        function smush_it() {
            var jqxhr = $.getJSON("smush.php?img=FOLDER&p="+start, function(data) {
              total = total + parseFloat(data[0].total);
              start = start+5;
              $("#total").text(total);
              $("#start").text(start);
              if(start < 5010) {
                  smush = setTimeout(function(){smush_it()},60000);
              }
            }).fail(function( jqxhr, textStatus, error ) {
                var err = start + ' ' + textStatus + ", " + error;
                smush = setTimeout(function () {
                    smush_it()
                }, 60000);
                $("#error").text(err);
            });
        }
 
        $("#total").text(total);
        $("#start").text(start);
        smush_it();
 
        });
    </script>
</head>
 
<body>
    <p>Progress: <span id="start"></span></p>
    <p>KB saved: <span id="total"></span></p>
    <p id="error"></p>
</body>
</html>

PHP:

<?php
// fetch a URL with cURL and return the response body
function loadFile($url) {
    $ch = curl_init();
 
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_URL, $url);
 
    $data = curl_exec($ch);
    curl_close($ch);
 
    return $data;
}
 
 
// write a remote file over the local copy at $path.$fn
function downloadfile($file, $path, $fn) {
    if(isset($file) && isset($path) && isset($fn)) {
        $fc = implode('', file($file));
 
        $mode = (file_exists($path . $fn) ? 'w': 'x+');
        $Files = fopen($path . $fn, $mode);
 
        if ((fwrite($Files, $fc)) != 0){
            fclose($Files);
        } else{
            echo 'Error.';
        }
    }
}
 
if(isset($_REQUEST['img']) && isset($_REQUEST['p']) && is_numeric($_REQUEST['p'])) {
   $total = 0;
   $limit = 5;
   $start = $_REQUEST['p'];
   $end = $start + $limit;
   $item = 0;
 
   $handle = opendir('../images/'.$_REQUEST['img']);
   while (false !== ($file = readdir($handle))){
      if($item >= $start && $item < $end) {
         $extension = strtolower(substr(strrchr($file, '.'), 1));
         if($extension == 'jpg'){
            $image = json_decode(loadFile('http://yourdomain.com/?i=http://yourdomain.com/images/'.$_REQUEST['img'].'/'.$file));
 
            $change = $image[0]->src_size-$image[0]->dest_size;
 
            if($change > 1) {
               $total += $change;
               downloadfile($image[0]->dest,'../images/'.$_REQUEST['img'].'/',$file);
            }
         } 
      } elseif($item >= $end) {
         break;
      }
      $item++;
   } 
 
   echo '[{"total":"'.$total.'"}]';
}
 
?>

The HTML page goes through images in the FOLDER you designate, 5 at a time, until it hits image 5010 – you can adjust that limit based on your folder size. It sends an AJAX request to the PHP file, then reports back the total KB saved.  It runs the script once a minute (the 60000 ms timeout), which you can change depending on your server’s performance.

Once the PHP page receives a request it makes sure it has all the required request variables, sets up the counters, then goes through all the images in the designated folder.

The compressor will handle png and gif files too, but I only had jpgs in this project, so I limited it to processing jpgs.  Once it finds a jpg, it sends the full URL to your image compression service – you can replace “yourdomain.com” with the URL of your service.  Then it decodes the JSON, and if there was any reduction in file size it downloads the new image over the old one.  Once it’s gone through 5 images, it reports the total KB saved back to the HTML page.

 


Simple Smusher

I wrote a post on compressing images with TinyPNG, but that 500 image limit ran out fast!  So I decided to see if I could put together my own image optimization service.

After a little digging I found the image_optim gem.  It’s very nice and straightforward, and the companion image_optim_pack gem installs all the compression libraries as well – making for a very hassle-free set-up.  After getting those installed and setting up my Rails project, it really came down to a simple interface with one controller and one view.
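
For reference, a minimal Gemfile entry for this kind of set-up (assuming you let image_optim_pack supply the compression binaries) might look like:

# Gemfile
gem 'image_optim'
gem 'image_optim_pack'  # ships precompiled binaries for the compression tools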

Controller:

require 'open-uri'
require 'image_optim'
require 'fileutils'
 
class WelcomeController < ApplicationController
  def index
    Dir.glob("public/tmp/*").
      select{|f| File.mtime(f) < (Time.now - (60*120)) }.
      each{|f| File.delete(f) }
 
    image = params[:i]
    extension = File.extname(image)
 
    acceptable_extensions = [".jpg", ".gif", ".png"]
 
    return @json = "{\"error\":\"only jpg, png, and gif can be resized\"}" if !acceptable_extensions.include?( extension.downcase )
 
    value = ""; 8.times{value  << (65 + rand(25)).chr}
    tmp_image = "public/tmp/#{value}#{extension}"
 
    open(tmp_image, 'wb') do |file|
      file << open(image).read
    end
 
    before_filesize = (File.size(tmp_image) * 0.001).floor
 
    # combine the options in one call (only the last assignment would take effect otherwise)
    image_optim = ImageOptim.new(:pngout => false, :nice => 10)
 
    image_optim.optimize_image!(tmp_image)
 
    after_filesize = (File.size(tmp_image) * 0.001).floor
 
    tmp_image = "http://yourdomain.com/tmp/#{value}#{extension}"
 
    @json = "[{\"dest\":\"#{tmp_image}\",\"src_size\":#{before_filesize},\"dest_size\":#{after_filesize}}]"
  end
 
end

View:

<%= raw @json %>

I’m including open-uri to download the image to be compressed, image_optim to do the compressing, and fileutils to check the file size.

  1. I start with the Dir.glob to first delete any temporary images that are more than two hours old (the 60*120 seconds check).
  2. Then I grab the image from the URL provided with the i param, and make sure it has the right extension
  3. Then generate a random string and create the temporary filename and save the image there
  4. Then I optimize the image and save over it
  5. Then I just write out the JSON I want by hand – I wrote it to resemble smush.it’s syntax

You will need to change “yourdomain.com” to your actual domain here; looking back, I should have done that programmatically – but it’s not a big deal.
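
If you did want to build that URL programmatically, a minimal sketch (relying on Rails’ request.base_url inside the controller action) would be:

# instead of hard-coding the domain when building the response URL
tmp_image = "#{request.base_url}/tmp/#{value}#{extension}"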

I have the whole project available on GitHub for download.


Simple reference for creating a Rails migration

I keep forgetting the rails generator syntax for migrations :X  I can never find just a quick easy guide for it – so I thought I’d write one up!

Instructions

  1. cd …
    Obviously, first make sure you’re in the main project folder in the console
  2. export RAILS_ENV=environment
    Also, make sure you’re in the right environment; you will probably want to make the change in your test or development environment first, and then push it to production
  3. rails g migration AddColumnToModel column_name:column_type
    Here’s the main part; on older versions of Rails, “rails g migration” will be “script/generate migration”.  The “AddColumnToModel” syntax is important and should describe the migration you’re making, like “AddEmailToUser”, and at the end you can add a “column_name:column_type” pair for each column you want to add – with the last example you would put “email:string”
  4. rake db:migrate
    Run the migration and add those columns to the database!  You’ll probably want to review the migration that was generated first though – it’ll be the latest file in the …/db/migrate directory (a sketch of a generated migration follows this list)
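
For example, the “AddEmailToUser email:string” generator call would produce a migration along these lines – a rough sketch, since the filename timestamp and the method style (change vs. up/down) depend on your Rails version:

# db/migrate/<timestamp>_add_email_to_user.rb
class AddEmailToUser < ActiveRecord::Migration
  def change
    add_column :users, :email, :string
  end
end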

AND MAKE SURE TO RESTART YOUR SERVER AFTER MAKING A CHANGE

Column Types

  • binary: for files / data blobs
  • boolean: true/false
  • date: date only (year, month, day)
  • datetime: date + time
  • decimal: precise decimal numbers – for when math needs to be accurate
  • float: decimal numbers
  • integer: whole numbers
  • string: up to 255 alphanumeric characters
  • text: effectively unlimited alphanumeric characters
  • time: time only (hours, minutes, seconds)
  • timestamp: same as datetime

 

And, if you make a mistake, a quick way to go back and fix it is “rake db:migrate:redo”, which will undo your last change and re-run it.

Also, if you want to run only a specific migration, you can do “rake db:migrate:up VERSION=20090408054532”

In a pinch you can also run a migration by hand in the console with the following format:
ActiveRecord::Migration.add_column :table, :column_name, :column_type

References:
http://stackoverflow.com/questions/11889048/is-there-documentation-for-the-rails-column-types
http://stackoverflow.com/questions/15162055/rails-generate-migration
http://stackoverflow.com/questions/7694487/ruby-on-rails-how-can-i-revert-a-migration-with-rake-dbmigrate


Social Login with Phonegap, Omniauth, Tokens, and InAppBrowser

For a project I needed to integrate Facebook and Twitter logins to a phonegap app.  I went through a ton of plugins and jsOAuths, but they were all quite cumbersome. I really wanted a simpler approach that would handle Twitter and Facebook at the same time.

This article about Twitter integration with Child Browser got me started in the right direction.  It’s a little dated with Child Browser, so I set out to update it to work with InAppBrowser, but during the process I realized I could actually handle this issue a lot more simply – mostly because I only needed to log in and didn’t require any deeper integration with the social platforms.

So after a little more digging I came across this lovely article on cross window communication with InAppBrowser which was everything I needed to take care of business!

In my case, the server is going to be doing all the heavy lifting, and for this project we already had Omniauth set up for social login and tokens, so I only needed to make a few tweaks to get this rolling.

The first tweak was to get it to output JSON for the app.  Out of the box, Omniauth will keep track of your params, so I made it return a JSON response when I requested it in the query string, like so: “/users/auth/facebook?json=true”.

Just for example, here’s how it would look if you were using the standard Facebook action for Omniauth:

class Users::OmniauthCallbacksController < Devise::OmniauthCallbacksController
  def facebook
    @user = User.from_omniauth(request.env["omniauth.auth"])
 
    if @user.persisted?
      sign_in @user, :event => :authentication #this will throw if @user is not activated
    else
      session["devise.facebook_data"] = request.env["omniauth.auth"]
      return redirect_to new_user_registration_url
    end
 
    return render :json => current_user if request.env["omniauth.params"]["json"]
 
    set_flash_message(:notice, :success, :kind => "Facebook") if is_navigational_format?
    redirect_to "/"
  end
end

The important parts here are:

  • I’m using “sign_in” instead of “sign_in_and_redirect” so I can handle the redirection later down the line.
  • Then I’m adding in “return render :json => current_user if request.env['omniauth.params']['json']” so the callback will output the current user in JSON.

In my case the JSON output for current_user includes a token I can use to integrate with the API.
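
For illustration, one way to expose a token like that (the api_token attribute here is hypothetical – use whatever your schema actually calls it) is to override as_json on the User model:

# app/models/user.rb – sketch only; adjust the attribute names to your own schema
def as_json(options = {})
  super(:only => [:id, :email]).merge(:api_token => api_token)
end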

With that all set you’re ready to capture that JSON in phonegap:

win = window.open(base_url+'/users/auth/facebook?json=true', '_blank', 'location=yes,enableViewPortScale=yes');
win.addEventListener('loadstop', function (e) {
  if (e.url.indexOf(base_url) >= 0 && e.url.indexOf('callback') >= 0) {
    win.executeScript(
      { code: "document.body.innerHTML" },
      function(values) {
        r = values[0];
        r = r.replace(r.substring(0, r.indexOf('{')),"").replace(r.substring(r.indexOf('}'),r.length).replace(/}/g,''),"");
        r = jQuery.parseJSON(r);
        //now r.api_token is available to use however you need in your app
        win.close();
      }
    );
  }
});

This works by:

  1. First opening a window with the social login page “/users/auth/facebook?json=true”
    In my app, base_url is the domain of the API – like http://api.google.com, for example.
  2. Next I’m listening for every time a page in InAppBrowser finishes loading, with “addEventListener('loadstop', ...)”
  3. Then I’m waiting for the URL to contain my API’s domain name plus “callback”: “if (e.url.indexOf(base_url) >= 0 && e.url.indexOf('callback') >= 0) {”
  4. Once there, I use “executeScript” to fetch the document body, which is returned as values[0].
  5. Then I clean the return value of any HTML using replace and substring, and parse the result as JSON.

After that you can use that token in your App however you need to integrate with your API.
