Time Wasters

Because you can’t be productive all the time


Exporting data to CSV in rails console

I’ve had a similar post to this in the past where I exported an entire table to CSV, but what if you want to export only a subset of that data? Well, that’s what this post is for 🙂

  1. Start with a query that narrows things down as much as possible
  2. Then loop through those results to apply any additional filtering you need
  3. Finally generate the CSV file with a header
# 1. Start with a query that narrows things down as much as possible
date = Date.new(2016, 12, 15)
l = Table.where("created_at > ? AND column != 'variable'", date)

# Silence query logging so the console output stays readable
ActiveRecord::Base.logger = nil

# 2. Loop through the results and apply any additional filtering
results = []
l.each do |i|
  u = User.where(:email => i.email)
  if u.present?
    if Table.where(:email => u.first.email).blank?
      results << { :email => i.email, :member => 'y' }
    end
  else
    results << { :email => i.email, :member => 'n' }
  end
end; 0 # the trailing 0 keeps the console from echoing the whole array

# 3. Generate the CSV file with a header row
require 'csv'

file = "#{Rails.root}/public/data.csv"

CSV.open(file, 'w') do |writer|
  writer << results.first.keys # header row from the hash keys
  results.each do |s|
    writer << s.values
  end
end
 
 


simple-zoom

Just finished pushing a very simple zoom app to GitHub. I couldn’t find a nice script that would let an image take up the full size of its parent element when zoomed in and handle updating the image dynamically, so I wrote one for myself.

Get the files on GitHub.

See a preview:


Monitoring & Understanding Server Load

I have a site whose performance could stand some improvement.  I want to be able to make sure that any changes I make are actually having an impact, so I’m going to start by setting up some monitoring tools. I found a very nice and free monitoring application called Load Average. After I got that all set up, I did a bit of research so I could fully understand and make the most of the numbers I see in there.

CPU LOAD

First off, I needed to get a handle on how CPU Load is measured. If you run “cat /proc/loadavg” on a Linux server you will get a string similar to this:

0.02 0.03 0.00 1/437 21084

The first three numbers are the average CPU Load over the last 1, 5, and 15 minutes. (The fourth field is the number of currently running processes over the total, and the fifth is the most recently assigned process ID.)

The lower the numbers the better, and the highest they should get is the number of cores you have.  With one core it would be bad if the load went over 1, but with 8 cores a value of 1 wouldn’t be a problem, while 8.5 would.  To find out how many cores you have, run “nproc.”
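Putting those two commands together, a quick sketch of a per-core load check (nothing fancy, just cut, nproc, and awk):

```shell
# Read the 1-minute load average and the core count, then work out
# the load per core; values at or above 1.0 per core mean the CPU
# is saturated.
load=$(cut -d ' ' -f 1 /proc/loadavg)
cores=$(nproc)
echo "1-minute load: $load over $cores core(s)"
awk -v l="$load" -v c="$cores" 'BEGIN { printf "per-core load: %.2f\n", l / c }'
```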

MEMORY USAGE

Check how much memory you’re using by running “free -m.” You’ll get a readout like this:

             total       used       free     shared    buffers     cached
Mem:         11909      10785       1124          1        234       9372
-/+ buffers/cache:       1178      10731
Swap:            0          0          0

The number to watch out for here is the used buffers/cache figure, here “1178.”  That’s the memory actually being used by the applications currently running on your server.  That number should stay below the total memory plus swap, here “11909.”
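If your version of free doesn’t print the “-/+ buffers/cache” line (newer procps releases dropped it), you can get roughly the same “used by applications” figure straight from /proc/meminfo; a rough sketch:

```shell
# MemTotal minus MemFree, Buffers, and Cached approximates the memory
# actually used by applications (the old "-/+ buffers/cache" used
# column), converted from kB to MB.
awk '/^MemTotal:/ {t=$2} /^MemFree:/ {f=$2} /^Buffers:/ {b=$2} /^Cached:/ {c=$2}
     END { printf "apps: %d MB of %d MB total\n", (t-f-b-c)/1024, t/1024 }' /proc/meminfo
```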

 

References:

http://stackoverflow.com/questions/11987495/linux-proc-loadavg
http://blog.scoutapp.com/articles/2009/07/31/understanding-load-averages
http://serverfault.com/questions/67759/how-to-understand-the-memory-usage-and-load-average-in-linux-server
http://www.cyberciti.biz/faq/linux-get-number-of-cpus-core-command/


iDrive and hidden files

I’ve been trying to back up all my files to iDrive since August >.o It’s about 500GB, so I suppose I shouldn’t be too surprised it’s taken forever, and apparently iDrive is slow. But just yesterday I was cleaning up some files and noticed some problematic ones that haven’t been helping the backup process (over 210K of them, in fact!). So I took some time and did 4 things to get my backups rolling.

#1 Delete all .AppleDouble files

I was working on a Mac for a few years and have some back-ups from it, and after going through the logs I noticed a ton of “.AppleDouble” files. Apparently Macs make an AppleDouble of every.single.file. So…there were literally twice as many files to back up. I can just tell iDrive to ignore those files, and I did, but I also wanted to get rid of them all, since I’m no longer on a Mac. So I loaded up the shell on my QNAP and ran this command:

rm -rf `find -type d -name .AppleDouble`

Which deletes all directories and their contents named “.AppleDouble” under the current directory.
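If any of your paths contain spaces, the backtick version can mangle them; letting find run the delete itself avoids that (and -prune stops find from complaining about directories it just removed):

```shell
# Same clean-up, but safe for paths with spaces or newlines:
# find deletes each matching directory itself instead of passing
# the names back through the shell.
find . -type d -name .AppleDouble -prune -exec rm -rf {} +
```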

#2 Delete all Icon\r files

iDrive also came up with a few odd errors like this one:

IOERROR [/share/MD0_DATA/…/Icon], : No such file or directory

I found about 20 of these odd Icon files; apparently they’re placeholders so Macs can change the icon of a folder. But their names are awkward, so iDrive can’t handle them and I couldn’t delete them normally. They’re actually named “Icon\r” (with a literal carriage return), so I had to log into the shell again and run this command to delete them all:

rm $’Icon\r’
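That rm only clears the current directory; since these were scattered around, a find-based variant (same literal carriage return in the name) can sweep the whole tree in one pass:

```shell
# Delete every "Icon\r" file below the current directory;
# $'Icon\r' is bash syntax for the name with an embedded CR.
find . -type f -name $'Icon\r' -delete
```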

#3 Stop generating thumbnails

And last, but not least, is my QNAP’s fault: by default its file manager generates .@__thumb folders with icons for every single image, of which I have quite a few. Log in as admin and you can disable the thumbnail generation this way:

[Screenshot: Sakata_remove_icons]

That will stop new ones from being created, but it won’t delete the old ones; fire up the shell again and run this command to do that:

rm -rf `find -type d -name .@__thumb`

#4 Upgrade to the latest iDrive

I also noticed that the version of the iDrive app in QNAP’s App Center is out of date, so I grabbed the latest version here: https://www.idrive.com/qnap-backup

The only problem is you have to download the right build for your NAS, but no matter where I looked I couldn’t find which one mine is running. I ended up finding out by opening iDrive in the App Center and hovering over the download link, and voila! The version is right in the file name.  A little backwards, but I couldn’t find the answer anywhere on the QNAP site.

[Screenshot (109): the App Center download link showing the version in the file name]

References:

http://apple.stackexchange.com/questions/31867/what-is-icon-r-file-and-how-do-i-delete-them
http://stackoverflow.com/questions/13032701/how-to-remove-folders-with-a-certain-name
http://forum.qnap.com/viewtopic.php?f=24&t=80532&start=75


Compress Rails project images

We were having some issues with slow loading times on our site, so I set out to compress all of our uploaded images.  Since we have thousands of images, doing it by hand is not really an option, and plenty of them are transparent PNGs.  So I enlisted the help of TinyPNG.

TinyPNG is an online application that compresses both PNG and JPG images while preserving the alpha layer in PNGs, and it offers an API along with a Ruby gem, so you can use it to automate the compression. Start by installing the gem; I also use the fastimage gem in this script, so if you don’t have it, install that too.

gem install tinify
gem install fastimage
rbenv rehash # if you're using rbenv

Now you just need the script below:

require "tinify"
require 'fastimage'
Tinify.key = "YOUR API KEY"

i = 0
Dir.glob("public/system/rich/rich_files/rich_files/**/*.{jpg,png}") do |item|
  # only do 50 images at a time
  break if i == 50

  # ignore directories and original files
  next if File.directory?(item) or item.include? "original"

  # ignore images less than 50KB
  filesize = (File.size(item) * 0.001).floor
  next if filesize < 50

  # ignore images whose width + height is 600px or more but are under 100KB
  size = (FastImage.size(item).inject(:+))
  next if size >= 600 and filesize < 100
  i = i + 1

  # compress the image in place via the TinyPNG API
  source = Tinify.from_file(item)
  source.to_file(item)

  new_filesize = (File.size(item) * 0.001).floor

  puts "Converted: #{item}"
  puts "Original: #{filesize}KB Optimized: #{new_filesize}KB"
end

puts "All done!"

Some things you’ll need to address to get this working: first, swap out “YOUR API KEY” with your actual API key, which you can get here. And if you’re not using the rich gem to handle image uploads, you’ll need to change “public/system/rich/rich_files/rich_files” to wherever your images live. The “**/*” part of the path searches all sub-directories and the files in them, and “.{jpg,png}” limits the search to jpg and png images.
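If you want to sanity-check the glob before letting the script loose on real uploads, you can try it against a throwaway directory first (the file names below are just made up for the example):

```ruby
require "tmpdir"
require "fileutils"

# Build a tiny fake upload tree and see what the pattern matches
matches = Dir.mktmpdir do |root|
  FileUtils.mkdir_p("#{root}/uploads/2016/thumbs")
  FileUtils.touch("#{root}/uploads/2016/photo.jpg")
  FileUtils.touch("#{root}/uploads/2016/thumbs/small.png")
  FileUtils.touch("#{root}/uploads/notes.txt")

  # "**/" matches zero or more directory levels, so both files match
  Dir.glob("#{root}/uploads/**/*.{jpg,png}")
end

puts matches # the .txt file is not included
```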

The script goes through all the JPG and PNG files in your images directory, skipping original versions and files that are too small to bother with. It prints each file it compresses along with its original and optimized sizes, then prints “All done!” once it’s finished.

Save it as tinify.rb and you can run it in the console with:

ruby tinify.rb

This will process 50 images at a time, and it ignores images I’ve judged too small to bother with: everything under 50KB, plus images whose width and height add up to 600px or more but that are under 100KB.  But this does pose a problem – images that aren’t reduced enough will get re-compressed every time you run the script.  For now I’m just monitoring it, and I’ll adjust it if too many images get caught.
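One way around the re-compression problem (just a sketch; the tinified.log file name and the record_processed helper are my own invention, not part of the script above) is to log every path you compress and skip it on later runs:

```ruby
require "set"

# Hypothetical log of already-compressed files (assumes the working
# directory is writable).
LOG = "tinified.log"

# Load previously processed paths so repeat runs can skip them
done = File.exist?(LOG) ? Set.new(File.readlines(LOG, chomp: true)) : Set.new

# Remember a path both in memory and on disk
def record_processed(path, done)
  done << path
  File.open(LOG, "a") { |f| f.puts(path) }
end

# Inside the Dir.glob loop you would then add:
#   next if done.include?(item)
#   ...compress with Tinify...
#   record_processed(item, done)
```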
