ELK on Docker + Syslog

Versions used:

  • Ubuntu 16.04
  • Docker version 17.03.1-ce, build c6d412e
  • Docker-Compose version 1.12.0-rc2, build 08dc2a4

Install Docker CE

Install a few basics that we need:

sudo apt-get install \
    apt-transport-https \
    ca-certificates \
    curl \
    software-properties-common

Add docker.com’s GPG key

curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -

Add the apt-get repo:

sudo add-apt-repository \
   "deb [arch=amd64] https://download.docker.com/linux/ubuntu \
   $(lsb_release -cs) \
   stable"

Update package list

sudo apt-get update

Install ‘docker-ce’ package

sudo apt-get install docker-ce

Test to verify all is well

sudo docker run hello-world                                                                                                            

Install Docker-Compose

Get the docker-compose release and pipe it to a local file

curl -L https://github.com/docker/compose/releases/download/1.12.0-rc2/docker-compose-`uname -s`-`uname -m` > ~/docker-compose

Now move the file to a location within your path

sudo mv ~/docker-compose /usr/local/bin/docker-compose

And make it executable

sudo chmod +x /usr/local/bin/docker-compose

Verify it runs by checking the version

docker-compose --version

Install ELK (Elasticsearch, Logstash and Kibana)

Fortunately this has been made easy for us thanks to a project on GitHub – to get the basics running we just need to clone one repo:

cd ~
git clone https://github.com/deviantony/docker-elk

Configure ELK stack

Elasticsearch

Move into the directory:

cd docker-elk

Elasticsearch uses a hybrid mmapfs / niofs directory by default to store its indices. The default operating system limit on mmap counts is likely to be too low, which may result in out-of-memory exceptions. We’ll need to increase the vm.max_map_count value:

Elevate to root:

sudo su -

Increase the value now:

sysctl -w vm.max_map_count=262144

Make it persist across reboots:

Add the vm.max_map_count=262144 setting to /etc/sysctl.conf. You can verify after rebooting by running sysctl vm.max_map_count
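For example, while still root, a quick way to append the setting and confirm the running value:

echo 'vm.max_map_count=262144' >> /etc/sysctl.conf
sysctl vm.max_map_count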

Drop back to previous user:

exit

Now let’s configure things so that the Elasticsearch data (indices) is held outside of the container – for example:

mkdir -p ~/docker-elk/elasticsearch/data

Now we need to configure the binding in the docker-compose.yml file for the elasticsearch service. Use your favorite editor to open ~/docker-elk/docker-compose.yml and add the following between the build and ports sections of the elasticsearch service – while we are here, let’s set the auto-restart policy too:

    restart: always
    volumes:
      - ./elasticsearch/data:/usr/share/elasticsearch/data
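For reference, after the edit the elasticsearch service might look something like this (a sketch – other keys such as environment or networks depend on the version of the repo you cloned):

  elasticsearch:
    build: elasticsearch/
    restart: always
    volumes:
      - ./elasticsearch/data:/usr/share/elasticsearch/data
    ports:
      - "9200:9200"
      - "9300:9300"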

Logstash + ‘Syslog’ input

The default Logstash configuration has a TCP listener on 5000; however, for my (our?) purposes we want to listen for syslog messages. There are syslog Logstash inputs around – but “syslog” is not exactly standard, and there are a number of varying RFCs – so instead of using a rigid input, let’s use standard TCP and UDP listeners and have a grok expression do the parsing for us; that way we can tweak it if necessary. Most of my syslog sources are rsyslog, so the grok expression I have seems to play nice with that.

Use your editor to open ./logstash/pipeline/logstash.conf and make the contents look like this:
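(What follows is a sketch based on the standard Logstash syslog recipe – plain TCP/UDP inputs on 5000, a grok filter that copes with rsyslog-style messages, grok failures written to a dated file, everything else shipped to Elasticsearch – so treat it as a starting point and tweak to taste.)

input {
  tcp {
    port => 5000
    type => syslog
  }
  udp {
    port => 5000
    type => syslog
  }
}

filter {
  if [type] == "syslog" {
    grok {
      # rsyslog-style: "TIMESTAMP HOST PROGRAM[PID]: MESSAGE"
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  # anything grok couldn't parse goes to a dated file
  # (note: this path is inside the container unless you add a volume mapping for it)
  if "_grokparsefailure" in [tags] {
    file {
      path => "/var/log/failed_syslog_events-%{+YYYY-MM-dd}"
    }
  }
  elasticsearch {
    hosts => "elasticsearch:9200"
  }
}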

You’ll notice there is provision for when grok cannot parse a message: it gets logged to the host file system at /var/log/failed_syslog_events-. This is useful for seeing what is failing so you can adjust the grok expression as appropriate. You’ll also see that port 5000 was selected for syslog, even though the standard syslog port is 514 – this is because ports below 1024 require root privilege and I don’t want to run Logstash as root. Instead we will modify the docker-compose config so that the host port-forwards 514 -> 5000.

Let’s do that now: use your editor to open docker-compose.yml, then configure the ports section of the logstash service as follows – again, we’ll set the restart policy at the same time:

    restart: always
    ports:
      - "514:5000"
      - "514:5000/udp"

While you’re in there, add the same restart policy to the kibana service.
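So that service ends up with the same line (other keys unchanged):

  kibana:
    restart: always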

Start-up ELK stack

The assumption is that you’re in the ~/docker-elk directory.

In case you’ve started stuff already, stop all:

sudo docker-compose down

Get docker events printing to the terminal – useful for a first-time run or for troubleshooting issues:

sudo docker events &

Start up ELK in the background:

sudo docker-compose up -d

Verify each of the 3 ELK containers is running with sudo docker ps – you should also see the port forwarding in place for 514 -> 5000 (both TCP and UDP)

At this point it’s good to wait a few minutes – or until the docker events die down – as it can take quite a while for the full stack to be up and ready.

Kibana needs some data to populate its indices – so let’s pump the contents of /var/log/syslog on the host in via TCP on port 514:

nc localhost 514 < /var/log/syslog

Now go to a browser and open your Kibana instance, which listens on port 5601 by default:

http://<docker-host>:5601

You’ll be taken to the Index Patterns page, and so long as the data you pumped in was correct (and all the prior steps worked) you should be able to accept the defaults and it will create an index pattern for you.

Go to the ‘Discover’ page and you should see the logs you pumped in earlier. If you do, your basic ELK stack with syslog is now functional – you can move on to getting clients sending their logs to it, and start creating visualizations and dashboards in Kibana.
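For example, to point an rsyslog client at the stack, a drop-in like this forwards everything over TCP (the file name and the 192.168.1.10 address are just placeholders for your ELK host; a single @ would send via UDP instead):

echo '*.* @@192.168.1.10:514' | sudo tee /etc/rsyslog.d/90-elk.conf
sudo systemctl restart rsyslog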

If the docker events output is getting annoying, kill the background process:

sudo pkill -f 'docker events'

Drop a question/comment and I’ll try and assist.


Home Brew – CO2 Carbonation Chart

Use this chart to determine the ‘set and forget’ settings (temperature and pressure) to apply to your kegged beer to properly carbonate it with a CO2 pressure tank. There are a couple of these around on the net, but they’re usually limited to degrees Fahrenheit and had a few other annoyances for me, so I took a couple of minutes to put together my own slightly simplified version.

Carbonation Chart

Virtualize Raspberry Pi (i.e. ARM) on OS X using QEMU

Here’s how:

  • Install and upgrade Xcode to 4.3 or above.
  • Install the Xcode Command Line Tools (you can do this from within Xcode’s “Downloads” preference pane).
  • Install Homebrew, using the instructions here – http://brew.sh/
  • Force Homebrew to install version 1.1.0 of QEMU
    git checkout 2b7b4b3 Library/Formula/qemu.rb
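Put together, that last step looks something like this (a sketch – Homebrew of that era kept formula files under Library/Formula inside its prefix, usually /usr/local):

cd "$(brew --prefix)"
git checkout 2b7b4b3 Library/Formula/qemu.rb
brew install qemu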


Panoramas on Linux


This shot is made up of 7 RAW shots from my Canon EOS 400D (aka Rebel XTi) w/ EF-S 17-85mm IS USM lens; here’s the process I followed:

The shoot:

1. Mounted the camera on a tripod in portrait orientation. Doing this has two benefits: firstly, it puts the barrel distortion introduced by your lens at the top and bottom, making the shots easier to blend and stitch; secondly, it gives your pano more height. The downside, of course, is that you’ll need to take more shots than you would in landscape to get the same field of view.

2. Put the camera in Manual mode to ensure that the exposure is locked. If you use any of the auto/semi-auto modes, your camera will re-meter the exposure for each shot, causing the brightness of each shot to differ.

3. Then I selected a specific white balance, in this case ‘daylight’ – the important thing is not to have it on Auto White Balance, otherwise each shot is likely to be a different colour temperature.

4. Next I dialed the aperture down to a tiny f/22, to get the deepest depth of field (DoF) possible so that both the foreground and background are sharp and in focus.

5. Then I let autofocus do the focus work for me; once focus was achieved I switched to manual focus to ensure that each shot was taken with the same focus.

6. Finally, using my Canon IR remote, I took the first shot, then carefully panned the tripod head until there was approximately 20% overlap with the previous shot and fired again – continuing until I had the complete field of view I was after.

The post processing:

I did the post processing on my Linux (Ubuntu) box; as a minimum you’ll need the following: dcraw, ImageMagick, Hugin, autopano-sift-C and Enblend.

Here is the process I followed:

1. Converted my RAW images to JPEG using dcraw. I used a custom version of the script available here: http://jcornuz.wordpress.com/2008/07/10/here-is-a-little-something-for-your-blog/ (steps 1–3 are sketched as shell commands after this list)

2. Renamed the .JPEG output to .JPG, because Autopano-sift-C fails with .JPEG extensions.

3. Batch rotated the JPEGs using mogrify from ImageMagick (mogrify -rotate "-90" *.JPG)

4. Opened images in Hugin, and got it to use Autopano-sift-C to automatically find the control points between the set of images.

5. Hugin then uses ‘nona’ to remap the geometry of the images, then ‘enblend’ to stitch them all together.
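For reference, steps 1–3 as a rough shell sketch (assuming .CR2 RAW files straight off the camera – my actual conversion used the customised script linked in step 1):

# dcraw -c writes to stdout, -w uses the camera's white balance;
# ImageMagick's convert reads the PPM from stdin and writes a JPEG
for f in *.CR2; do dcraw -c -w "$f" | convert - "${f%.CR2}.JPEG"; done
# rename .JPEG -> .JPG so autopano-sift-C is happy
for f in *.JPEG; do mv "$f" "${f%.JPEG}.JPG"; done
# batch rotate the portrait shots
mogrify -rotate "-90" *.JPG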

FYI – I did run into an issue with the latest CVS version of Enblend (v3.2), and had to downgrade to a previous version to make it work.

Excel tip: Sum of visible filtered rows

I am a big fan of the MS Excel AutoFilter feature and use it extensively. Every now and then I’ve wished there was some function within Excel to do a SUM, but only over the rows left visible by the AutoFilter… well, a few minutes on Google and I found it.

Use the SUBTOTAL function with the relevant function_num; for addition I used 109. For example: =SUBTOTAL(109, A1:A1000)

Syntax: SUBTOTAL(function_num, ref1, ref2, …)
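A quick worked example (made-up numbers): with 10, 20 and 30 in A1:A3, and the row containing 20 filtered out, =SUM(A1:A3) still returns 60, while =SUBTOTAL(109, A1:A3) returns 40 – only the visible rows are summed.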

Function_num               Function_num
(includes hidden values)   (ignores hidden values)   Function

 1                         101                       AVERAGE
 2                         102                       COUNT
 3                         103                       COUNTA
 4                         104                       MAX
 5                         105                       MIN
 6                         106                       PRODUCT
 7                         107                       STDEV
 8                         108                       STDEVP
 9                         109                       SUM
10                         110                       VAR
11                         111                       VARP