How to version large files with Git LFS

Versioning large files (such as audio samples, videos, datasets, and graphics) can be difficult when working with distributed version control systems like Git. Fortunately, a new extension to Git makes handling of large files easier: Git Large File Storage (LFS) is an open-source project that replaces large files with text pointers inside Git, while storing the contents of the files on a remote server like GitHub or an AWS bucket.

After running the installation script, set up LFS via the following command:

$ git lfs install

Tracking file types
All you need to do now is to tell Git LFS which file types to track. Navigate to your Git repository, and issue a git lfs track command. For example, if you want Git LFS to automatically handle all .mat files in your repository (although it’s rarely a smart idea to have binaries under version control), you would call:

$ git lfs track "*.mat"
If your Git repository has subdirectories, you can use globbing to track all .mat files in all subdirectories:

$ git lfs track "**/*.mat"
Or you can track single files:

$ git lfs track myLargeFile.mat
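Under the hood, each git lfs track call records the pattern in a .gitattributes file at the root of your repository (remember to commit this file along with your changes). After the commands above, it might look like this:

```
*.mat filter=lfs diff=lfs merge=lfs -text
**/*.mat filter=lfs diff=lfs merge=lfs -text
myLargeFile.mat filter=lfs diff=lfs merge=lfs -text
```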
That’s it! Continue your work using git commit and git push as usual.

Storing large files
If you have tried uploading large files to the remote repository before, you might have noticed a warning popping up telling you that GitHub recommends against uploading files larger than 50 MB. Files larger than 100 MB are rejected outright. With Git LFS tracking the file, it will instead be uploaded to a dedicated remote host that is different from your remote repository, and the git push command will go through as usual:

$ git commit -am "add large file"
$ git push origin master

Instead of storing the file in the remote repository, Git LFS will upload only a small file reference. If you inspect the file on GitHub, you will only find a short note saying that it is stored with Git LFS.

Back in the local repository, you will notice that the file is still accessible, until you switch branches.

Retrieving large files
As soon as you switch branches, the locally stored binaries will be gone. If you now inspect the file controlled by Git LFS, all you will find is a tiny text file that might look something like this:

version https://git-lfs.github.com/spec/v1
oid sha256:d63d7c81d9191f17263b0c65f97101083dade9637e069aea23c6be778cbf89bd
size 68536835
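A pointer file like this is plain text with simple key/value lines, so it is easy to inspect programmatically. A minimal sketch in Python (the parse_pointer helper is illustrative, not part of Git LFS):

```python
def parse_pointer(text):
    """Parse a Git LFS pointer file into a dict of its key/value lines."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:d63d7c81d9191f17263b0c65f97101083dade9637e069aea23c6be778cbf89bd
size 68536835
"""

info = parse_pointer(pointer)
print(info["oid"])        # SHA-256 of the real file content
print(int(info["size"]))  # size of the real file in bytes
```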
So where did your file go, you might ask? It is still on the LFS remote host. To download it from there and check it out into your working tree, use the following command:

$ git lfs pull

(git lfs fetch alone downloads the objects into the local LFS cache without replacing the pointer files in your working tree; git lfs pull does both.)
To see a list of all LFS-related commands, simply type:

$ git lfs

Glenn Greenwald: Why the CIA is smearing Edward Snowden after the Paris attacks

Decent people see tragedy and barbarism when viewing a terrorism attack. American politicians and intelligence officials see something else: opportunity.

Bodies were still lying in the streets of Paris when CIA operatives began exploiting the resulting fear and anger to advance long-standing political agendas. They and their congressional allies instantly attempted to heap blame for the atrocity not on Islamic State but on several preexisting adversaries: Internet encryption, Silicon Valley’s privacy policies and Edward Snowden.

In one sense, this blame-shifting tactic is understandable. After all, the CIA, the NSA and similar agencies receive billions of dollars annually from Congress and have been vested by their Senate overseers with virtually unlimited spying power. They have one paramount mission: find and stop people who are plotting terrorist attacks. When they fail, of course they are desperate to blame others.

The CIA’s blame-shifting game, aside from being self-serving, was deceitful in the extreme. To begin with, there still is no evidence that the perpetrators in Paris used the Internet to plot their attacks, let alone used encryption technology.

The claim that the Paris attackers learned to use encryption from Snowden is even more misleading. For many years before anyone heard of Snowden, the U.S. government repeatedly warned that terrorists were using highly advanced means of evading American surveillance.

Then-FBI Director Louis Freeh told a Senate panel in March 2000 that “uncrackable encryption is allowing terrorists — Hamas, Hezbollah, Al Qaeda and others — to communicate about their criminal intentions without fear of outside intrusion.”

Or consider a USA Today article dated Feb. 5, 2001, eight months before the 9/11 attack. The headline warned “Terror groups hide behind Web encryption.” That 14-year-old article cited “officials” who claimed that “encryption has become the everyday tool of Muslim extremists.”

Even the official version of how the CIA found Osama bin Laden features the claim that the Al Qaeda leader only used personal couriers to communicate, never the Internet or telephone.

Within the Snowden archive itself, one finds a 2003 document that a British spy agency called “the Jihadist Handbook.” That 12-year-old document, widely published on the Internet, contains instructions for how terrorist operatives should evade U.S. electronic surveillance.

In sum, Snowden did not tell the terrorists anything they did not already know. The terrorists have known for years that the U.S. government is trying to monitor their communications.

What the Snowden disclosures actually revealed to the world was that the U.S. government is monitoring the Internet communications and activities of everyone else: hundreds of millions of innocent people under the largest program of suspicionless mass surveillance ever created, a program that multiple federal judges have ruled is illegal and unconstitutional.

That is why intelligence officials are so eager to demonize Snowden: rage that he exposed their secret, unconstitutional schemes.

W7PUA Home Page

Welcome to my Web Page. Information is available here on electronic hobby projects that I’ve been developing. In addition, I plan to add some things about my other interests, including boats and the outdoors. Bob Larkin, W7PUA, Corvallis, Oregon

The DSP-10 is a two-meter ham-radio transceiver that uses a DSP processor for the IF and audio stages. It was published in QST magazine in September, October, and November of 1999. It is well suited to weak-signal communication and has special features for use with microwave transverters.

Most of my ham operation is at VHF, UHF and Microwave frequencies.

I also enjoy sailing, paddling and building wooden boats.

Radbahn Berlin

The idea behind Radbahn is simple!

We want to use the undeveloped space under Berlin's elevated U1 railway line for cycling. 8.9 kilometers of covered bicycle path through the city – that would be unique in the world, and for Berlin it's a step into the future.

From City-West to Friedrichshain

The Radbahn leads from City-West through one of Berlin's most modern and popular parks, along the canal, then through several creative and wild hotspots of Kreuzberg, and over the Oberbaumbrücke to Friedrichshain.

Audience Data and Analytics for Digital Media Publishers

The first part of the two-step tracking integration is the installation of the tracking code. Once you have the JavaScript running, move on to the documentation about the metadata tag!

The Tracker is a small piece of JavaScript code that monitors user actions taken on your site and relays them to the analytics server.

To get started, insert the Tracker in the footer of your website template. Ideal placement is as the last code block before the closing of the body tag. If your website uses a templating system, this is usually in the “footer” template.
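As a sketch, the placement described above might look like this in an HTML template; the script URL is a placeholder, since the real one comes from your analytics provider:

```html
<body>
    <!-- ... your page content ... -->

    <!-- Tracker goes last, just before the closing body tag.
         The src below is a placeholder; use the URL from your provider. -->
    <script async src="https://analytics.example.com/tracker.js"></script>
  </body>
```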

JavaScript implementation is the recommended way of tracking, but it is also possible to send the pixel data to the analytics server from your site's server side. This page describes and demonstrates how to format HTTP requests to send visitor data in that case.

To send visitor data, make an HTTP GET request to the analytics server's pixel endpoint. The request headers should contain a User-Agent string set to the device information (browser, version, etc.) of the user initiating the action.
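A server-side pixel request along these lines might be assembled as follows in Python; the endpoint URL and query parameter names are placeholders, since the real ones depend on your analytics provider:

```python
from urllib.parse import urlencode
from urllib.request import Request

# Placeholder endpoint -- substitute your provider's pixel URL.
PIXEL_ENDPOINT = "https://analytics.example.com/pixel"

def build_pixel_request(page_url, visitor_id, user_agent):
    """Build a GET request carrying visitor data, forwarding the end
    user's User-Agent header (not the server's own)."""
    query = urlencode({"url": page_url, "visitor_id": visitor_id})
    req = Request(f"{PIXEL_ENDPOINT}?{query}", method="GET")
    req.add_header("User-Agent", user_agent)
    return req

req = build_pixel_request(
    "https://news.example.com/article-1",
    "d3adb33f",
    "Mozilla/5.0 (iPhone; CPU iPhone OS 15_0 like Mac OS X)",
)
print(req.full_url)
```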

When using our JavaScript implementation, you may need to be aware of privacy considerations surrounding the tracker's data collection approach.

By default, the tracker collects standard web browser information about a reader, the uses of which are described below:

ip_address: IP address of the user; used for bot blocking and geo trends
user_agent: Identifier for the user’s device; used for device analytics (mobile vs desktop)
first_party_uuid: Site-specific identifier (UUID) for user; used in loyalty analytics (new vs returning)
third_party_uuid: Network unique identifier (UUID) for user; used in aggregate benchmarking
To comply with Personally Identifiable Information (PII) restrictions in certain geographic regions, as well as the privacy policies of individual publishers, the tracker can selectively disable the collection of two of these pieces of information (individually, for a specific site or API key). These are ip_address and third_party_uuid.
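A sketch of what such selective disabling might look like on the collection side; the field names mirror the list above, but the configuration mechanism itself is hypothetical:

```python
def collect_event(raw_event, disabled_fields=frozenset()):
    """Return a copy of the event with any disabled fields stripped.
    Only ip_address and third_party_uuid may be disabled."""
    allowed_to_disable = {"ip_address", "third_party_uuid"}
    to_strip = disabled_fields & allowed_to_disable
    return {k: v for k, v in raw_event.items() if k not in to_strip}

event = {
    "ip_address": "203.0.113.7",
    "user_agent": "Mozilla/5.0",
    "first_party_uuid": "1111-aaaa",
    "third_party_uuid": "2222-bbbb",
}

# A site with strict PII rules disables both optional fields:
print(collect_event(event, {"ip_address", "third_party_uuid"}))
```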

Disabling ip_address tracking will prevent the tracker from doing bot blocking for your site, but may lead to better compliance with strict privacy policies that consider IP addresses to be PII. There are also plans to eventually incorporate geographic information into the dashboards, but blocking IP addresses will rule that feature out for publishers with ip_address tracking disabled.

Disabling third_party_uuid will prevent the tracker from doing network-wide benchmarking for your site, but it will not disable any other functionality.