Posts tagged ‘Other’

Krebs on Security: Spam Nation Book Tour Highlights

This post was syndicated from: Krebs on Security and was written by: BrianKrebs. Original post: at Krebs on Security

Greetings from sunny Austin, Texas, where I’m getting ready to wrap up a week-long book tour that began in New York City, then blazed through Chicago, San Francisco, and Seattle. I’ve been trying to tweet links to various media interviews about Spam Nation over the past week, but wanted to offer a more comprehensive account and to share some highlights of the tour.

For three days starting last Sunday, I was in New York City — doing a series of back-to-back television and radio interviews. Prior to leaving for New York, I taped television interviews with Jeffrey Brown at the PBS NewsHour; the first segment delves into some of the points touched on in the book, and the second piece is titled “Why it’s harder than you think to go ‘off the grid’.”


On Monday, I was fortunate to once again be a guest on Terry Gross’s show Fresh Air, which you can hear at this link. Tuesday morning began with a five-minute appearance on CBS This Morning, which included a sit-down with Charlie Rose, Gayle King and Norah O’Donnell. Later in the day, I was interviewed by the Marketplace Tech Report and MSNBC’s The Cycle, as well as the Tavis Smiley show. Wednesday was a mercifully light day, with just two interviews: KGO-AM and the Jim Bohannon Radio Show.

Thursday’s round of media appearances began at around sunrise in the single-digit temperature Chicago suburbs. My driver from the hotel to all of these events took me aback at first. Roxanna was a petite blonde from Romania who could have just as easily been a supermodel. I thought for a moment someone was playing a practical joke when I first heard her “Gud mornink Meester Krebs” in an Eastern European accent upon stepping into her Town Car, but Roxanna was a knowledgeable driver who got us everywhere on time and didn’t take any crap from anyone on the road.

The first of those interviews was a television segment for WGN News and a taped interview with TouchVision, followed by my first interview in front of a studio audience at Windy City Live. The guest who went on right before me was none other than the motivational speaker/life coach Tony Robbins, who is a tough act to follow and was also on the show to promote his new book. At six feet seven inches, Robbins is a larger-than-life guy whose mere presence almost took up half the green room. Anyway, Mr. Robbins had quite the security detail, so I took this stealthie of Tony as he was confined to the makeup chair prior to his appearance.

On Thursday afternoon, after an obligatory lunch at the famous Billy Goat burger joint (the inspiration for the “Cheezborger, cheezborger, cheezborger” Saturday Night Live skit), I visited the Sourcebooks office in Naperville, met many of the folks who worked on Spam Nation, signed a metric ton of books, and signed the company’s author wall.

The Spam Nation signing in Naperville, IL.


After an amazing dinner with my sister and the CEO of Sourcebooks, we headed to my first book signing event just down the street. It was a well-attended event with some passionate readers and fans, including quite a few folks from @BurbsecWest with whom I had beers afterwards.

On Friday, I hopped a plane to San Francisco and sat down for taped interviews with USA Today and Bloomberg News. The book signing that night at Books Inc. drew a nice crowd and also was followed by some after-event celebration.

I departed for Seattle the next morning and sat down for a studio interview with longtime newsman (and general mensch) Herb Weisbaum at KOMO-AM. The signing in Seattle, at Third Place Books, drew the largest turnout of all, and included a very inquisitive crowd that bought up all of the copies of Spam Nation that the store had on hand.

Yours Truly at a book signing in Seattle's Third Place Books.


If you’re planning to be in Austin tonight — Nov. 24 — consider stopping by B&N Arboretum at 7:00 p.m. to get your copy of Spam Nation signed. I’ll be holding one more signing at 7:00 p.m. on Dec. 4 at Washington, D.C.’s Politics & Prose.

For those on the fence about buying Spam Nation, Slate and LinkedIn both ran excerpts of the book. Other reviews and interviews are available at Fortune.com, Yahoo News and CreditCards.com. Also, I was interviewed at length several times over the past month by CBS’s 60 Minutes, which is doing a segment on retail data breaches. That interview could air as early as Nov. 30. On that note, the Minneapolis Star Tribune ran a lengthy story on Sunday that followed up on some information I first reported a year ago about a Ukrainian man thought to be tied to the Target breach, among others.

Raspberry Pi: ramanPi: an open source 3D-printable Raman spectrometer

This post was syndicated from: Raspberry Pi and was written by: Helen Lynn. Original post: at Raspberry Pi

The 2014 Hackaday Prize offered fabulous prizes for the best exemplars of an open, clearly documented device involving connected electronics. Committed hardware hacker fl@c@ (we understand that’s pronounced “flatcat”) wasn’t in the habit of opening up their work, but had been thinking that perhaps they should, and this seemed the perfect opportunity to give it a go. They decided to make an entry of one of their current works-in-progress, a DIY Raman spectrometer based on a Raspberry Pi. The project, named ramanPi, made it to the final of the contest, and was declared fifth prize winner at the prize announcement in Munich a couple of weeks ago.

ramanPi optics overview

Raman spectroscopy is a molecular identification technique that, like other spectroscopic techniques, works by detecting and analysing the characteristic ways in which substances absorb and emit radiation in various regions of the electromagnetic spectrum. It relies on the phenomenon of Raman scattering, in which a tiny proportion of the light falling on a sample is absorbed and then re-emitted at a different frequency; the shift in frequency is characteristic of the structure of the material, and can be used to identify it.
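To make that “shift in frequency” concrete: Raman shifts are conventionally reported in wavenumbers (cm⁻¹), computed from the excitation and scattered wavelengths. A quick sketch of the arithmetic (this is illustration, not code from the ramanPi project):

```python
def raman_shift_cm1(excitation_nm, scattered_nm):
    """Raman shift in wavenumbers (cm^-1), the conventional unit.

    A wavelength of x nm corresponds to a reciprocal wavelength of
    1e7 / x cm^-1; the shift is the difference between the laser line
    and the scattered light.
    """
    return 1e7 / excitation_nm - 1e7 / scattered_nm

# A 532 nm laser line Stokes-shifted to 561 nm corresponds to a shift
# of roughly 972 cm^-1 -- a value characteristic of the sample's
# molecular structure.
```

Looking up that shift against reference spectra is, in essence, how the identification step works.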

The ideal molecular identification technique is sensitive (requiring only small quantities of sample), non-destructive of the sample, unambiguous, fast, and cheap; spectroscopic methods perform pretty well against all but the final criterion. This means that fl@c@’s Raman spectrometer, which uses a Raspberry Pi and 3D-printed parts together with readily available off-the-shelf components, removes an obstacle to using a very valuable technique for individuals and organisations lacking a large equipment budget.

The ramanPi uses a remote interface so that it can be viewed and controlled from anywhere. Like conventional Raman spectrometers, it uses a laser as a powerful monochromatic light source; uniquely, however, its design:

[…] is based on an open source concept that side steps the expensive optics normally required for raman spectroscopy. Ordinarily, an expensive notch filter would be used which is cost prohibitive for most average people. My system avoids this cost by using two less expensive edge filters which when combined in the correct manner provide the same benefit as the notch filter…at the minimal cost of a little extra computing time.

Once a cuvette containing the sample to be tested is loaded into the ramanPi, the laser is powered up behind a shutter and the first filter is selected while the cuvette’s temperature is stabilised. Then the shutter is disengaged and the sample exposed to laser light, and scattered light is collected, filtered and passed to a Raspberry Pi camera module for capturing and then analysis. The laser shutter is re-engaged and the process is repeated with the second filter. The Raspberry Pi combines multiple exposures into a single image and carries out further image processing to derive the sample’s Raman spectrum. Finally, the spectrum is compared with spectra in online databases, and any match found is displayed.
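Sketched in Python, with every hardware call replaced by a stub, the sequence above looks something like the following. The function names are illustrative inventions, not identifiers from the actual ramanPi source:

```python
events = []  # record of hardware actions; stands in for real I/O

def set_shutter(open_):
    events.append(("shutter", "open" if open_ else "closed"))

def select_filter(name):
    events.append(("filter", name))

def capture_exposure():
    events.append(("capture", None))
    return [0.0] * 8  # placeholder pixel data from the Pi camera module

def acquire(filters=("edge-filter-1", "edge-filter-2")):
    exposures = []
    for f in filters:
        set_shutter(False)   # keep the laser blocked while switching filters
        select_filter(f)     # move the next edge filter into the light path
        set_shutter(True)    # expose the sample to the laser
        exposures.append(capture_exposure())
        set_shutter(False)   # re-engage the shutter between exposures
    # Combine the filtered exposures into a single image for later
    # spectrum extraction and database matching.
    return [sum(px) / len(exposures) for px in zip(*exposures)]
```

The combination step is where the two cheaper edge filters substitute for the expensive notch filter, at the cost of the extra computation fl@c@ mentions.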

fl@c@ says,

I’ve been trying to build up the courage to share my work and ideas with the world because I think it benefits everyone. This project is my first to share, and for it to be featured here [in a Hackaday Prize Hacker bio] […] is really amazing. I appreciate this whole community, I’ve learned a lot from it over the years and I hope to be able to give back and contribute more soon!

We’re very glad fl@c@ did decide to share this – ramanPi is an astonishing first contribution to the open source movement, and something that’s likely to be of interest to schools, chemists, biologists, home brew enthusiasts, people who want to know what’s in their water, businesses, ecologists and the simply curious.

You can read about ramanPi in much more detail, with further videos, diagrams, discussion and build instructions, on its Hackaday project page. We hope that this is far from the last we’ll hear of this project, or of fl@c@!

TorrentFreak: Pirate Bay Founder Preps Appeal, Puts the Press Straight

This post was syndicated from: TorrentFreak and was written by: Andy. Original post: at TorrentFreak

After being arrested in Cambodia during September 2012 it soon became clear that two Scandinavian countries wanted to get their hands on Gottfrid Svartholm.

Sweden had a long-standing interest in their countryman for his infamous work on The Pirate Bay, but once that was out of the way a pair of hacking cases had to be dealt with.

The first, in Sweden, resulted in partial successes for both sides. While Gottfrid was found guilty of hacking into IT company Logica, following testimony from Jacob Appelbaum he was later cleared by the Appeal Court (Svea Hovrätt) of hacking into Nordea Bank.

But despite this significant result and a repeat appearance from Appelbaum, the trial that concluded in Denmark last month went all one way, with Gottfrid picking up a three-and-a-half year sentence.

With his mother Kristina acting as go-between, TorrentFreak recently fired off some questions to Gottfrid to find out how he’s been bearing up following October’s verdict and to discover his plans for the future.

Firstly, TF asked about his opinion on the decision. Gottfrid declined to answer directly but indicated we should look to the fact that he has already filed an appeal against the verdict. That should be enough of an answer, he said.

As it stands and considering time served, Gottfrid could be released as early as August 2015, but that clearly isn’t deterring him from the possibility of leaving sooner. Gottfrid has always shown that he’s both stubborn and a fighter, so sitting out his sentence in silence was probably never an option.

Moving on, TF pressed Gottfrid on what he feels were the points of failure during the court process and how these will play out alongside his appeal.

“Can’t discuss defense strategy at this point,” he responded. Fair enough.

Even considering the preparations for an appeal, there are a lot of hours in the coming months that will prove hard to fill. However, Gottfrid’s comments suggest that his access to books has improved since his days in solitary confinement and he’s putting that to use.

“I study neurobiology and related subjects to pass the time,” he says, with mother Kristina noting that this education is self-motivated.

“The ‘arrest house’ can of course not provide him with opportunities for higher studies,” she says.

Although he’s been thrust into the public eye on many occasions, Gottfrid’s appearances at court in Sweden (documented in TPB AFK) and later in his Danish trial reveal a man with an eye for detail and accuracy. It perhaps comes as little surprise then that he also took the opportunity to put the record straight on something he knows a lot about – the history of The Pirate Bay.

If one searches for “founders of The Pirate Bay” using Google, it’s very clear from many thousands of reports that they are Gottfrid Svartholm, Fredrik Neij and Peter Sunde. According to Gottfrid, however, that simply isn’t true.

“TPB was founded by me and two people who haven’t been involved since 2004,” Gottfrid says. “Fredrik came into the picture when the site moved from Mexico to Sweden, probably early 2004.”

While acknowledging Fredrik’s work as important for the growth of the site, Gottfrid noted that Peter’s arrival came sometime later. He didn’t specify who the other two founders were but it’s likely they’re to be found among the early members of Piratbyrån as detailed here.

With Peter Sunde already released from his sentence and Fredrik Neij close to beginning his, it’s possible that the founders trio could all be free men by the end of 2015. So does Gottfrid have anything exciting up his sleeve for then?

“Yes, I have plans, but I’m not sharing them,” he concludes.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

TorrentFreak: Piracy Monetization Firm Rightscorp Sued for Harassment and Abuse

This post was syndicated from: TorrentFreak and was written by: Ernesto. Original post: at TorrentFreak

Copyright holders have been sending DMCA takedown notices to ISPs for over a decade, but in recent years these warnings have turned into revenue opportunities.

Companies such as Rightscorp ask U.S. ISPs to forward DMCA notices to subscribers, with a settlement offer tagged on to the end. On behalf of Warner Bros, BMG and others, Rightscorp asks subscribers to pay $20 per pirated file or risk a potential $150,000 in court.

In recent months there have been various complaints from people who were aggressively approached by Rightscorp, which has now resulted in a class-action complaint against the piracy monetization firm.

The lawsuit was filed at a California federal court on behalf of Karen Reif, Isaac Nesmith and others who were approached by Rightscorp. In the complaint, Rightscorp is accused of violating the Telephone Consumer Protection Act, violating debt collection laws, and abuse of process.

One of the allegations describes the repeated use of robo-calls to alleged infringers. A summary of what happened to Karen Reif shows that once Rightscorp knows who you are, they don’t give up easily.

“By late September of 2014, Ms. Reif was receiving on average about one robo-call per day, and sometimes one robo-call and one live call in the same day. These calls came in from a variety of different numbers, from different area codes all over the country,” the complaint alleges.

This bombardment of harassing robo-calls is a violation of the Telephone Consumer Protection Act, the lawyers argue.

The class-action further includes a long list of violations regarding Rightscorp’s debt collection practices, violating both the FDCPA and the Rosenthal Act.

“Among other wrongful conduct: Rightscorp has engaged in telephone harassment and abuse; made various false and misleading representations; engaged in unfair collections practices; failed to provide validation and required notices relating to the debts…,” the complaint reads.

In addition to the above Rightscorp allegedly made false representations that ISPs were participating in the debt collection. For example, the warning letter stated that ISPs would disconnect repeat infringers, something that rarely happened.

Finally, the complaint raises the issue of Rightscorp’s controversial DMCA subpoenas which demand that smaller ISPs should hand over personal details of their subscribers. Thus far most ISPs have complied, but according to the complaint these requests are a “sham and abuse” of the legal process.

“To identify potential consumers to target, Rightscorp has willfully misused this Court’s subpoena power by issuing at least 142 special DMCA subpoenas, per [the DMCA], to various Internet Service Providers.”

“These subpoenas, which were issued on this Court’s authority, but procured outside of an adversarial proceeding and without any judicial review, are so clearly legally invalid as to be a sham and abuse of the legal process,” the complaint reads.

The above is just a summary of the long list of complaints being brought against Rightscorp. With these settlement practices becoming more common, the case will definitely be one to watch.

Attorney Morgan Pietz is confident that they have a strong case and told TF that other Rightscorp victims are invited to get in touch.

“We would still be very interested in talking to anyone who was being contacted by Rightscorp or who paid settlements, particularly anyone who was getting the pre-recorded robo-calls,” Pietz said.

For Rightscorp the lawsuit is yet another setback. Earlier this month the piracy monetization firm reported that it continues to turn a loss, which may eventually drive the company towards bankruptcy.


TorrentFreak: Google Refuses MPAA Request to Blacklist ‘Pirate Site’ Homepages

This post was syndicated from: TorrentFreak and was written by: Ernesto. Original post: at TorrentFreak

Every week copyright holders send millions of DMCA takedown notices to Google, hoping to make pirated movies and music harder to find.

The music industry groups RIAA and BPI are among the most active senders. Together they have targeted more than 170 million URLs in recent years.

The MPAA’s statistics are more modest. Thus far the Hollywood group has asked Google to remove only 19,288 links from search results. The most recent request is one worth highlighting though, as it shows a clear difference of opinion between Hollywood and Google.

Last week the MPAA sent a DMCA request listing 81 allegedly infringing pages, mostly torrent and streaming sites.

Unlike most other copyright holders, the MPAA doesn’t list the URLs where the pirated movies are linked from, but the site’s homepages instead. This is a deliberate strategy, one that previously worked against KickassTorrents.

However, this time around Google was less receptive. As can be seen below most of the MPAA’s takedown requests were denied. In total, Google took “no action” for 60 of the 81 submitted URLs, including casa-cinema.net, freemoviestorrents.com and solarmovie.is.

Part of MPAA’s takedown request

It’s unclear why Google refused to take action, but it seems likely that the company views the MPAA’s request as too broad. While the sites’ homepages may indirectly link to pirated movies, for most of the sites reaching that content requires more than one click from the homepage.

We previously asked Google under what circumstances a homepage might be removed from search results. A spokesperson couldn’t go into detail but noted that “it’s more complex than simply counting how many clicks one page is from another.”

“We’ve designed a variety of policies to comply with the requirements of the law, while weeding out false positives and material that’s too remote from infringing activity,” a Google spokesperson told us.

In this case Google appears to see most reported homepages as not infringing, at least not for the works the MPAA specified.

The MPAA previously said that it would like to move towards blocking pirate sites from search engines entirely; however, Google’s recent actions suggest that the company doesn’t want to go that far just yet.


The Hacker Factor Blog: Lowering The Bar

This post was syndicated from: The Hacker Factor Blog and was written by: The Hacker Factor Blog. Original post: at The Hacker Factor Blog

The Electronic Frontier Foundation (EFF) is one of my favorite non-profit organizations. They have a huge number of attorneys who are ready to help people with issues related to online privacy, copyright, and security. If you’re about to make a 0-day exploit public and receive a legal threat from the software provider, then the EFF should be the first place you go.

The EFF actually provides multiple services. Some are top-notch, but others are not as high quality as they should be. These services include:

Legal Representation
If you need an attorney for an online issue, such as privacy or security, then they can give you direction. When I received a copyright extortion letter from Getty Images, the EFF rounded up four different attorneys who were interested in helping me fight Getty. (Getty Images backed down before I could use these attorneys.) Legal assistance is one of the EFF’s biggest and best offerings.

Legal News
The EFF continually releases news blurbs and whitepapers that discuss current events and their impact on security and privacy. Did you know that U.S. companies supply eavesdropping gear to Central Asian autocrats or that Feds proposed the secret phone database used by local Virginia cops? If you follow the EFF’s news feed, then you saw these reports. As a news aggregation service, their reports are very timely, but also very biased. The EFF’s reporting is biased toward a desire for absolute privacy online, even though nobody’s anonymous online.

Technical Services
The EFF occasionally promotes or releases software designed to assist with online privacy. While these efforts have good intentions, they are typically poorly thought out and can lead to significant problems. For example:

  • HTTPS Everywhere. This browser extension forces your web browser to use HTTPS whenever possible. It has a long set of configuration files that specify which sites should use HTTPS. Earlier this year, I wrote about some of the problems created by this application in “EFF’ing Up”. Specifically: (1) Some sites return different content if you use HTTPS instead of HTTP, (2) they do not appear to test their configuration files prior to releasing them, and (3) they do not fix bad configuration files.

  • TOR. The EFF is a strong supporter of the TOR Project, which consists of a network of servers that help anonymize network connections. The problem is that the EFF wants everyone to run a TOR relay. For a legal organization, the EFF seems to forget that many ISPs forbid end consumers from running public network services — running a TOR relay may violate your ISP’s terms of service. The TOR relay will also slow down your network connection as other people use your bandwidth. (Having other people use your bandwidth is why most consumer-level ISPs forbid users from hosting network services.) And if someone else uses your TOR relay to view child porn, then you are the person that the police will interrogate. In effect, the EFF tells people to run a network service without revealing any of the legal risks.

Free SSL

The EFF recently began promoting a new technical endeavor called Let’s Encrypt. This free CA server should help web sites move to HTTPS. News outlets like Boing Boing, The Register, and ExtremeTech all reported on this news announcement.

A Little Background

Let’s back up a moment… On the web, you can either connect to sites using HTTP or HTTPS. The former (HTTP) is unencrypted. That means anyone watching the network traffic can see what you are doing. The latter (HTTPS) is HTTP over SSL; SSL provides a framework for encrypting network traffic.

But notice how I say “framework”. SSL does not encrypt traffic. Instead, it provides a way for a client (like your web browser) and a server (like a web site) to negotiate how they want to transfer data. If both sides agree on a cryptographic setting, then the data is encrypted.
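In Python’s standard `ssl` module, for example, the client side of that negotiation is set up roughly like this (the hostname is purely illustrative):

```python
import socket
import ssl

# Build a client context; the handshake will negotiate the strongest
# protocol version and cipher suite that both ends support.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse older, weaker protocols

# Uncomment to perform a real handshake over the network:
# with socket.create_connection(("example.com", 443)) as raw:
#     with ctx.wrap_socket(raw, server_hostname="example.com") as tls:
#         print(tls.version())  # the protocol version that was agreed
#         print(tls.cipher())   # the cipher suite that was agreed
```

If the two sides cannot agree on an acceptable protocol and cipher, the handshake fails and no data is exchanged at all.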

HTTPS is not a perfect solution. In many cases, it really acts as a security placebo. A user may see that HTTPS is being used, but may not be aware that they are still vulnerable. The initial HTTPS connection can be hijacked (a man-in-the-middle attack) and fake certificates can be issued to phishing servers. Even if the network connection is encrypted, this does nothing to stop the web server from tracking users or providing malware, and nothing to stop vandals from attacking the web server. And all of this is before SSL exploits like Heartbleed and POODLE. In general, HTTPS should be considered a “better than nothing” solution. But it is far from perfect.

Entry Requirements

Even with all of the problems associated with SSL and HTTPS, for most uses it is still better than nothing. So why don’t more sites use HTTPS? There are really a few barriers to entry. The EFF’s “Let’s Encrypt” project is a great solution to one of these problems and a partial solution to another. However, it doesn’t address all of the issues, and it is likely to create some new problems that the EFF has not disclosed.

Problem #1: Pay to Play
When an HTTPS client connects to an HTTPS server, the server transmits a server-side certificate as part of the cryptographic negotiation. The client then checks with a third-party certificate authority (CA server) and asks whether the server’s certificate is legitimate. This allows the client to know that the server is actually the correct server.

The server’s certificate identifies the CA network that should be used to verify the certificate. Unfortunately, if the certificate can say where to go to verify it, then bad guys can issue a certificate and tell your browser that it should be verified by a CA server run by the same bad guys. (Yes, fake-bank.com looks like your bank, and their SSL certificate even looks valid, according to fake-ca.com.) For this reason, every web browser ships with a list of known-trusted CA servers. If the CA server is not on the known-list, then it isn’t trusted by default.

If there are any problems with the server’s certificate, then the web browser issues an alert to the user. The problems include outdated/expired certificates, coming from the wrong domain, and untrusted CA servers.

And this is where the first barrier toward widespread use comes in… All of those known-trusted CA servers charge a fee. If you want your web server to run with an SSL certificate that won’t generate any user warnings, then you need to pay one of these known-trusted CA servers to issue an SSL certificate for your online service. And if you run multiple services, then you need to pay them multiple times.
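That default-trust behavior is visible in client libraries, too. Python’s `ssl` module, for instance, ships a client context that enforces exactly the checks described above:

```python
import ssl

# A default client context loads the platform's trusted CA list and
# rejects any certificate that doesn't chain back to it.
ctx = ssl.create_default_context()

assert ctx.verify_mode == ssl.CERT_REQUIRED  # server must present a valid cert
assert ctx.check_hostname                    # and it must match the hostname

# A certificate from an unknown CA (or a self-signed one) fails these
# checks during the handshake, which is what triggers the warning a
# browser shows the user.
```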

The problems should be obvious. Some people don’t have money to pay for the trusted certificate, or they don’t want to spend the money. You can register a domain name for $10 a year, but the SSL certificate will likely run $150 or more. If your site doesn’t need SSL, then you’re not going to pay $150 to require it.

And then there are people like me, who cannot justify paying for a security solution (SSL) that isn’t secure. I cannot justify paying $150 or more, just so web browsers won’t see a certificate warning when they connect to my HTTPS services. (I use self-signed certificates. By themselves, they are untrusted and not secure, but I offer client-side certificates. Virtually no sites use client-side certificates. But client-side certs are what actually makes SSL secure.)

The EFF’s “Let’s Encrypt” project is a free SSL CA server. With this solution, cost is no longer an entry barrier. When their site goes live, I hope to use it for my SSL needs.

Of course, other CA services, like Entrust, Thawte, and GoDaddy, may lower their prices or offer similar free services. (You cannot data-mine users unless they use your service. Even with a “free” pricing model, these CA issuers can still make a hefty profit from collected user data.) As far as the EFF’s offerings go, this is a very disruptive technology for the SSL industry.

Problem #2: Server Installation
Let’s assume that you acquired an SSL certificate from a certificate authority (Thawte, GoDaddy, Let’s Encrypt, etc.). The next step is to install the certificate on your web server.

HTTPS has never been known for its simplicity. Installing the SSL server-side certificate is a nightmare of configuration files and application-specific complexity. Unless you are a hard-core system administrator, then you probably cannot do it. Even GUI interfaces like cPanel have multiple complex steps that are not for non-techies. You, as a user with a web browser, have no idea how much aggravation the system administrator went through in order to provide you with HTTPS and that little lock icon on the address bar. If they are good, then they spent hours. If it was new to them, then it could have been days.

In effect, lots of sites do not run HTTPS because it is overly complicated to install and configure. (And let’s hope that you don’t have to change certificates anytime soon…) Also, HTTPS certificates include an expiration date. This means that there is an ongoing maintenance cost that includes time and effort.
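Any automation of that ongoing maintenance needs, at minimum, an expiry check. A sketch of one (the 30-day renewal margin is a common convention, not anything from the EFF’s announcement):

```python
from datetime import datetime, timedelta

def needs_renewal(not_after, now=None, margin_days=30):
    """True if the certificate expires within margin_days of `now`.

    `not_after` is the certificate's expiry timestamp as a datetime,
    e.g. parsed out of the cert's notAfter field.
    """
    if now is None:
        now = datetime.utcnow()
    return now + timedelta(days=margin_days) >= not_after

# A cert expiring Jan 1 should be renewed from early December onward.
```

Scheduling this check (and the re-issuance it triggers) is precisely the kind of drudgery an automated management tool would take off the administrator’s plate.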

The EFF’s “Let’s Encrypt” solution says that it will include automated management software to help mitigate the installation and maintenance effort. This will probably work if you run one of their supported platforms and have a simple configuration file. But if you’re running a complex system with multiple domains, custom configuration files, and strict maintenance/update procedures, then no script from the EFF will assist you.

Of course, all of this is speculation since the EFF has not announced the supported platforms yet… So far, they have only mentioned a python script for Apache servers. I assume that they mean “Apache2” and not “Apache”. And even then, the configuration at FotoForensics has been customized for my own needs, so I suspect that their solution won’t work out-of-the-box for my needs.

Problem #3: Client Installation
So… let’s assume that it is past Summer 2015, when Let’s Encrypt becomes available. Let’s also assume that you got the server-side certificate and their automated maintenance script running. You’ve got SSL on your server, HTTPS working, and you’re ready for users. Now everything is about to work without any problems, right? Actually, no.

As pointed out in problem #1, unknown CA servers are not in the user’s list of trusted CA servers. So every browser connecting to one of these web servers will see that ugly alert about an untrusted certificate.

Every user will need to add the new Let’s Encrypt CA servers to their trusted list. And every browser (and almost every version of every browser) does this differently. Making matters worse, lots of mobile devices do not have a way to add new CA servers. It will take years or even decades to fully resolve this problem.

Windows XP reached its “end of life” (again), yet nearly 30% of Windows computers still run XP. IPv6 has been around for nearly 20 years, yet deployment is still at less than 10% for most countries. Getting everyone in the world to update/upgrade is a massive task. It is easier to release a new system than it is to update a deployed product.

The EFF may dream of everyone updating their web browsers, but that’s not the reality. The reality is that users will be quickly trained to ignore any certificate alerts from the web browsers. This opens the door for even more phishing and malware sites. (If the EFF really wanted to solve this problem, then they would phase out the use of SSL and introduce something new.)

There is one other possibility… Along with the EFF, IdenTrust is sponsoring Let’s Encrypt. IdenTrust runs a trusted CA service that issues SSL certificates. (The cost varies from $40 per year for personal use to over $200 per year, depending on various options.) Let’s Encrypt could piggy-back off of IdenTrust. This would get past the “untrusted CA service” problem.

But if they did rely on the known-trusted IdenTrust service that is already listed in every web browser… then why would anyone buy an SSL certificate from IdenTrust when they can get one for free via Let’s Encrypt? There has to be some catch here. Are they collecting user data? Every browser must verify every server, so whoever runs this free CA server knows when you connected to specific online services — that’s a lot of personal information. Or perhaps they hope to drive sales to their other products. Or maybe there will be a license agreement that prohibits the free service from commercial use. All of this would undermine the entire purpose of trying to protect users’ traffic.

Problem #4: Fake Domains
Phishing web sites, where bad guys impersonate your bank or other online service, have been using SSL certificates for years. They will register a domain like “bankofamerica.fjewahuif.com” and hope that users won’t notice the “fjewahuif” in the hostname. Then they register a real SSL certificate for their “fjewahuif.com” domain. At this point, victims see the “bankofamerica” text in the hostname and they see the valid HTTPS connection and they assume that this is legitimate.
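The deception works because a certificate authority validates control of the registrable domain only; any brand text to the left of it is free-form. This tiny Python sketch shows which domain a certificate actually vouches for. It naively assumes two-label registrable domains (wrong for suffixes like .co.uk); real code should consult the Public Suffix List.

```python
# Naive registrable-domain extractor: illustrates why a certificate for
# "bankofamerica.fjewahuif.com" only vouches for the scammer's own domain.
# Assumption: the registrable domain is the last two labels -- real code
# should use the Public Suffix List to handle multi-label suffixes.

def registrable_domain(hostname: str) -> str:
    """Return the domain a CA would actually validate for this hostname."""
    labels = hostname.lower().rstrip(".").split(".")
    return ".".join(labels[-2:])

print(registrable_domain("bankofamerica.fjewahuif.com"))  # fjewahuif.com
print(registrable_domain("www.bankofamerica.com"))        # bankofamerica.com
```

The victim’s eye catches the familiar brand at the front of the hostname, but the padlock only certifies the unfamiliar tail.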

The problem gets even more complicated when they use DNS hijacking. On rare occasions, bad guys have temporarily stolen domains and used them to capture customer information. For example, they could steal the “bankofamerica.com” domain and register a certificate for it at any of the dozens of legitimate CA servers. (If the real Bank of America uses VeriSign, then the fake Bank of America can use Thawte and nobody will notice.) With domain hijacking, it looks completely real but can actually be completely fake.

The price for an SSL certificate used to be a small deterrent. (Most scammers don’t mind paying $10 for a domain and $150 for a legitimate certificate, when the first victim will bring in a few thousand dollars in stolen money.) But a free SSL CA server? Now there’s no reason not to run this scam. I honestly expect the volume of SSL certificate requests at the EFF’s Let’s Encrypt servers to quickly grow to 50%-80% scam requests. (A non-profit with a legal emphasis that helps scammers? As M. Night Shyamalan says in Robot Chicken: “What a twist!“)

“Free” as in “Still has a lot of work to do before it’s really ready”

The biggest concern that I have with this EFF announcement is that the technology does not exist yet. Their web site says “Arriving Summer 2015” — it’s nearly a year away. While they do have some test code available, their proposed standard is still a draft and they explicitly say to not run the code on any production systems. Until this solidifies into a public release, this is vaporware.

But I do expect this to eventually become a reality. The EFF is not doing this project alone. Let’s Encrypt is also sponsored by Mozilla, Akamai, Cisco, and IdenTrust. These are companies that know browsers, network traffic, and SSL. These are some of the biggest names and they are addressing one of the big problems on today’s Internet. I have no doubt that they are aware of these problems; I just dislike how they failed to disclose these issues in their Pollyannaish press release. Just because it is “free” doesn’t mean it won’t have costs for implementation, deployment, maintenance, and customer service. In the open source world, “free” does not mean “without cost”.

Overall, I do like the concept. Let’s Encrypt is intended to make it easier for web services to implement SSL. They will be removing the cost barrier and, in some cases, simplifying maintenance. However, they still face an uphill battle. Users may need to update their web browsers (or replace their old cellphones), steps need to be taken to mitigate scams, users must not be trained to habitually accept invalid certificates, and none of this helps the core issue that HTTPS is a security placebo and not a trustworthy solution. With all of these issues still needing to be addressed, I think that their service announcement a few days ago was a little premature.

TorrentFreak: Luxury Watchmakers Target Pirate Smartwatch Faces

This post was syndicated from: TorrentFreak and was written by: Andy. Original post: at TorrentFreak

While digital watches have grown more complex in recent years, the advent of a new generation of smartwatches is changing the market significantly. Manufacturers such as Samsung, Sony, Pebble, Motorola and LG all have an interest in the game, with Apple set to show its hand in the early part of 2015.

Currently Android Wear compatible devices such as Motorola’s Moto360 are proving popular, not least due to their ability to display custom watch faces. Fancy Tag Heuer’s latest offering on your wrist? No problem. Rolex? Omega? Cartier? Patek Philippe? All just a click or two away.

Of course, having a digital copy of a watch on one’s wrist is a much cheaper option than the real deal. See that Devon watch fourth from left in the image below? A real-world version will set you back a cool $17,500. The copy? Absolutely free.

[Image: replica smartwatch faces, including the Devon]

While it’s been fun and games for a while, makers of some of the world’s most expensive and well known watches are now targeting sites offering ‘pirate’ smartwatch faces in order to have digital likenesses of their products removed from the market.

TorrentFreak has learned that IWC, Panerai, Omega, Fossil, Armani, Michael Kors, Tissot, Certina, Swatch, Flik Flak and Mondaine are sending cease and desist notices to sites and individuals thought to be offering faces without permission.

Richemont, a company behind several big brands including Cartier, IWC and Panerai, appears to be one of the frontrunners. The company is no stranger to legal action and recently made the headlines after obtaining court orders to have domains selling counterfeit watches blocked at the ISP level in the UK.

Notices seen by TorrentFreak reveal that the company, which made 2.75 billion euros from its watch division during 2012/2013, is lodging notices against watch face sites citing breaches of its trademark rights. Owners are being given 24 hours to remove infringing content.

We discussed the issue with Richemont’s PR representatives but were informed that on this occasion the company could not be reached for comment.

Earlier this week a source informed TF that Swatch-owned Omega had also been busy, targeting a forum with demands that all Omega faces should be removed on “registered trademark, copyright and design rights” grounds. Although the forum would not talk on the record, its operator revealed that the content in question had been removed. Omega did not respond to our requests for comment.

While watchmakers are hardly a traditional foe for those offering digital content, history shows us that they are prepared to act aggressively in the right circumstances.

Mondaine, a Swiss-based company also involved in the latest takedowns, famously found itself in a huge spat with Apple after Apple included one of its designs in iOS 6. That ended up costing Apple a reported $21 million in licensing fees. The same design is readily available for the Moto360 on various watch face sites.

So how are sites handling the claims of the watchmakers? TorrentFreak spoke with Luke, the operator of leading user-uploaded watch face site FaceRepo. He told us that the site had indeed received takedown notices from brand owners but made it very clear that uploading infringing content is discouraged and steps are being taken to keep it off the site.

“Although some of the replica faces we’ve received take downs for are very cool looking and represent significant artistic talent on the part of the designer, we believe that owners of copyrights or trademarks have the right to defend their brand,” Luke explained.

“If a copyright or trademark owner contacts us, we will promptly remove infringing material. To date, all requests for removal of infringing material have been satisfied within a matter of hours.”

Learning very quickly from other user generated content sites, FaceRepo notifies its users that their content has been flagged as infringing and also deactivates accounts of repeat infringers. A keyword filter has also been introduced which targets well known brands.

“If these [brand names] are found in the face name, description or tags, this will cause the upload to be rejected with a message stating that sharing of copyrighted or trademarked material is prohibited,” FaceRepo’s owner notes.

The development of a new front in the war to keep copyrighted and trademarked content off the Internet is hardly a surprise, and considering their power it comes as no shock that the watchmakers have responded in the way they have. We may be some time from an actual lawsuit targeting digital reproductions of physical products, but as the wearables market develops, one cannot rule them out.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

TorrentFreak: Fail: MPAA Makes Legal Content Unfindable In Google

This post was syndicated from: TorrentFreak and was written by: Ernesto. Original post: at TorrentFreak

The entertainment industries have gone head to head with Google in recent months, demanding tougher anti-piracy measures from the search engine.

According to the MPAA and others, Google makes it too easy for its users to find pirated content. Instead, they would prefer Google to downrank sites such as The Pirate Bay from its search results or remove them entirely.

A few weeks ago Google took additional steps to decrease the visibility of pirated content, but the major movie studios haven’t been sitting still either.

Last week the MPAA announced the launch of WhereToWatch.com, a website that lists where movies and TV-shows can be watched legally.

“WheretoWatch.com offers a simple, streamlined, comprehensive search of legitimate platforms – all in one place. It gives you the high-quality, easy viewing experience you deserve while supporting the hard work and creativity that go into making films and shows,” the MPAA’s Chris Dodd commented.

At first glance WhereToWatch offers a rather impressive database of entertainment content. It even features TorrentFreak TV, although this is listed as “not available” since the MPAA’s service doesn’t index The Pirate Bay.

Overall, however, it’s a decent service. WhereToWatch could also be an ideal platform to beat pirate sites in search results, something the MPAA desperately wants to achieve.

Sadly for the MPAA that is only a “could” since Google and other search engines currently have a hard time indexing the site. As it turns out, the MPAA’s legal platform isn’t designed with even the most basic SEO principles in mind.

For example, when Google visits the movie overview page, all links to individual pages are generated by JavaScript, so the search engine sees none of them. As a result, movie and TV-show pages on the MPAA’s legal platform are invisible to Google.
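As a sketch of why JavaScript-built links defeat crawlers (the HTML below is hypothetical, not WhereToWatch’s actual markup): Python’s standard html.parser, like a basic non-JavaScript crawler, treats everything inside a script element as raw text, so an anchor that exists only inside a script never registers as a link.

```python
# Compare what an HTML-only crawler extracts from a page whose links are
# injected by JavaScript versus a page with plain static anchors.
from html.parser import HTMLParser

# Hypothetical page: the <a> tag exists only inside a JS string and would
# appear in the DOM only after script execution.
JS_ONLY_PAGE = """
<html><body>
  <div id="movies"></div>
  <script>
    document.getElementById('movies').innerHTML =
      '<a href="/movie/example">Example Movie</a>';
  </script>
</body></html>
"""

STATIC_PAGE = '<html><body><a href="/movie/example">Example Movie</a></body></html>'

class LinkExtractor(HTMLParser):
    """Collects href attributes from <a> tags, as a naive crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

def extract_links(html: str) -> list:
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

print(extract_links(JS_ONLY_PAGE))  # [] -- nothing to crawl
print(extract_links(STATIC_PAGE))   # ['/movie/example']
```

A headless browser that executes the script would see the link; a plain HTML fetch, which is closer to how most indexers discover pages, finds nothing to follow.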

Google currently indexes only one movie page, which was most likely indexed through an external link. With Bing the problem is just as bad.

It’s worth noting that WhereToWatch doesn’t block search engines from spidering its content through the robots.txt file. It’s just the coding that makes it impossible for search engines to navigate and index the site.
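The two mechanisms are easy to conflate, so here is a quick sketch with Python’s urllib.robotparser (the rules shown are illustrative, not the site’s real file): robots.txt only answers “may I fetch this URL?”, and a permissive file is no help if the crawler never discovers the URLs in the first place.

```python
# robots.txt governs permission to fetch, not discoverability.
import urllib.robotparser

PERMISSIVE = ["User-agent: *", "Allow: /"]    # lets every crawler in
BLOCKING = ["User-agent: *", "Disallow: /"]   # shuts every crawler out

def can_crawl(rules, url, agent="Googlebot"):
    """Would a crawler with this user agent be allowed to fetch the URL?"""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(rules)
    return rp.can_fetch(agent, url)

print(can_crawl(PERMISSIVE, "https://example.com/movie/1"))  # True
print(can_crawl(BLOCKING, "https://example.com/movie/1"))    # False
```

Even with the permissive rules, a page whose inbound links are built by JavaScript simply never comes up for fetching.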

This is a pretty big mistake, considering that the MPAA repeatedly hammered on Google to feature more legal content. With some proper search engine optimization (SEO) advice they can probably fix the problem in the near future.

Google has previously offered SEO tips to copyright holders, but it’s obvious that the search engine wasn’t consulted on this project.

To help the MPAA on its way we asked isoHunt founder Gary Fung for some input. Last year Fung lost his case to the MPAA, forcing him to shut down the site, but he was glad to offer assistance nonetheless.

“I suggest MPAA optimize for search engine keywords such as ‘download ‘ and ‘torrent ‘. For some reason when people google for movies, that’s what they actually search for,” Fung tells us.

A pretty clever idea indeed, as the MPAA’s own research shows that pirate-related search terms are often used to “breed” new pirates.

Perhaps it’s an idea for the MPAA to hire Fung or other “industry” experts for some more advice. Or better still, just look at how the popular pirate sites have optimized their sites to do well in search engines, and steal their work.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

TorrentFreak: Swedes Prepare Record File-Sharing Prosecution

This post was syndicated from: TorrentFreak and was written by: Andy. Original post: at TorrentFreak

Following a lengthy investigation by anti-piracy group Antipiratbyrån, in 2010 police raided a “warez scene” topsite known as Devil. Dozens of servers were seized, containing an estimated 250 terabytes of pirated content.

One man was arrested and earlier this year was eventually charged with unlawfully making content available “intentionally or by gross negligence.”

Police say that the man acted “in consultation or concert with other persons, supplied, installed, programmed, maintained, funded and otherwise administered and managed” the file-sharing network from where the infringements were carried out. It’s claimed that the Devil topsite had around 200 members.

All told the man is accused of illegally making available 2,250 mainly Hollywood movies, a record amount according to the prosecutor.

“We have not prosecuted for this many movies in the past. There are many movies and large data set,” says prosecutor Fredrik Ingblad. “It is also the largest analysis of computers ever made in an individual case.”

Few details have been made available on the case but it’s now been revealed that Antipiratbyrån managed to trace the main Devil server back to the data center of a Stockholm-based electronics company. The site’s alleged operator, a man from Väsbybo in his 50s and employee of the company, reportedly admitted being in control of the server.

While it would likely have been the intention of Devil’s operator for the content on the site to remain private, leaks inevitably occurred. Predictably some of that material ended up on public torrent sites, an aggravating factor according to Antipiratbyrån lawyer Henrik Pontén.

“This is a very big issue and it is this type of crime that is the basis for all illegal file sharing. The films available on Pirate Bay circulate from these smaller networks,” Pontén says.

The big question now concerns potential damages. Pontén says that the six main studios behind the case could demand between $673,400 and $2.69m per movie. Multiply that by 2,250 and the potential total is astronomical (somewhere between roughly $1.5 billion and $6 billion), but the lawyer says that in order not to burden the justice system, a few titles could be selected.

Henrik Olsson Lilja, a lawyer representing the defendant, declined to comment in detail but criticized the potential for high damages.

“I want to wait for the trial, but there was no intent in the sense that the prosecutor is looking for,” Lilja told Mitte.se. “In practice, these are American-style punitive damages.”

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

Backblaze Blog | The Life of a Cloud Backup Company: Backblaze + Time Machine = ♥

This post was syndicated from: Backblaze Blog | The Life of a Cloud Backup Company and was written by: Yev. Original post: at Backblaze Blog | The Life of a Cloud Backup Company

“Why do I need online backup if I already have Time Machine?” We get that question a lot. Our answer: use both. Backblaze strongly believes in a 3-2-1 backup policy. What’s 3-2-1? Three copies of your data, on two different media, with one copy off-site. If you have that baseline, you’re in good shape. The on-site portion of your backup strategy is typically the original copy of the data plus an external hard drive of some sort. Most of our Mac customers use Time Machine, so that’s the one we’ll focus on here.

Raising Awareness
Apple did a great job with Time Machine, and with building awareness for backups. When you plugged in your first external hard drive, your Mac would ask if you wanted to use that drive as a Time Machine backup drive, which was instrumental in teaching users about the importance and potential ease of backups. It also dramatically simplified data backup, making it automatic and continuous. Apple knew that having people manually drag and drop files into folders and drives was not a reliable backup strategy. By making it automatic, Time Machine got many people doing local backups, but this still left a hole in their backup strategy: they had nothing off-site.

Why Bother
Having an off-site backup comes in handy when your computer and local backup (Time Machine in this case) are both lost. That can happen because of fire, theft, flood, forgetfulness, or a wide variety of other unfortunate reasons. Stories of people neglecting to replace a failed Time Machine drive and then having their computer crash are well known. A current off-site backup, such as an automatic online backup, can also augment the local Time Machine backup, especially when traveling. For example, if your laptop’s hard drive crashes while you’re on vacation, Time Machine can recover everything up to the point when you left for your trip, and your online backup can fill in the rest.

Some Limitations
One limitation of Time Machine is that a hard drive doesn’t scale with the amount of data you have. When you purchase a 500GB drive, that’s all the space you have for your backup. For example, if you have a Mac Pro or MacBook with a Time Machine drive connected, it will back up the data on that computer. If you add an additional hard drive into the mix as a storage drive, the Time Machine drive may not be large enough to handle both data sets, from the Mac and from the additional storage. So the more data you accumulate, the larger the Time Machine drive you need.

Additionally, if you store data on the Time Machine drive itself, those files are not actually included in the Time Machine backup, so be wary! Apple and Backblaze strongly recommend using a separate, dedicated drive for your Time Machine backup and not keeping any original data on that drive. That way, if the drive fails, you only lose one copy instead of potentially losing both. Backblaze works similarly: because your Backblaze backup is off-site, it adds another layer of protection against data loss.

Diversification
So use both! And if you’re on a PC, use an external hard drive as your second media type (most come with their own local-backup software). There’s no such thing as too many backups. Backing up is like a retirement or stock portfolio: the more diversification you have, the less vulnerability you have!

Author information

Yev

Social Marketing Manager at Backblaze

Yev enjoys speed-walking on the beach. Speed-dating. Speed-writing blog posts. The film Speed. Speedy technology. Speedy Gonzales. And Speedos. But mostly technology.

Follow Yev on:

Twitter: @YevP | LinkedIn: Yev Pusin | Google+: Yev Pusin

The post Backblaze + Time Machine = ♥ appeared first on Backblaze Blog | The Life of a Cloud Backup Company.

LWN.net: Introducing AcousticBrainz

This post was syndicated from: LWN.net and was written by: n8willis. Original post: at LWN.net

MusicBrainz, the not-for-profit project that maintains an assortment of “open content” music metadata databases, has announced a new effort named AcousticBrainz. AcousticBrainz is designed to be an open, crowd-sourced database cataloging various “audio features” of music, including “low-level spectral information such as tempo, and additional high level descriptors for genres, moods, keys, scales and much more.” The data collected is more comprehensive than MusicBrainz’s existing AcoustID database, which deals only with acoustic fingerprinting for song recognition. The new project is a partnership with the Music Technology Group at Universitat Pompeu Fabra, and uses that group’s free-software toolkit Essentia to perform its acoustic analyses. A follow-up post digs into the AcousticBrainz analysis of the project’s initial 650,000-track data set, including examinations of genre, mood, key, and other factors.

LWN.net: Version 2 of the kdbus patches posted

This post was syndicated from: LWN.net and was written by: jake. Original post: at LWN.net

The second version of the kdbus patches has been posted to the Linux kernel mailing list by Greg Kroah-Hartman. The biggest change since the original patch set (which we looked at in early November) is that kdbus now provides a filesystem-based interface (kdbusfs) rather than the /dev/kdbus device-based interface. There are lots of other changes in response to v1 review comments as well. “kdbus is a kernel-level IPC implementation that aims for resemblance to [the] protocol layer with the existing userspace D-Bus daemon while enabling some features that couldn’t be implemented before in userspace.”

TorrentFreak: U.S. Copyright Alert System Security Could Be Improved, Review Finds

This post was syndicated from: TorrentFreak and was written by: Ernesto. Original post: at TorrentFreak

In February last year the MPAA, RIAA and five major Internet providers in the United States launched their “six strikes” anti-piracy plan.

The Copyright Alert System’s main goal is to inform subscribers that their Internet connections are being used to share copyrighted material without permission. These alerts start out friendly in tone, but repeat infringers face a temporary disconnection from the Internet or other mitigation measures.

The evidence behind the accusations is provided by MarkMonitor, which monitors BitTorrent users’ activities on copyright holders’ behalf. The overseeing Center for Copyright Information (CCI) previously hired an impartial and independent technology expert to review the system, hoping to gain trust from the public.

Their first pick, Stroz Friedberg, turned out to be not so impartial, as the company had previously worked as an RIAA lobbyist. To correct this unfortunate choice, CCI assigned Professor Avi Rubin of Harbor Labs to re-examine the system.

This week CCI informed us that a summary of Harbor Labs’s findings is now available to the public. The full review is not being published due to the vast amount of confidential information it contains, but the overview of the findings does provide some interesting details.

Overall, Harbor Labs concludes that the evidence gathering system is solid and that false positives, cases where innocent subscribers are accused, are reasonably minimized.

“We conclude, based on our review, that the MarkMonitor AntiPiracy system is designed to ensure that there are no false positives under reasonable and realistic assumptions. Moreover, the system produces thorough case data for alleged infringement tracking.”

However, there is some room for improvement. For example, MarkMonitor could implement additional testing to ensure that false positives and human errors are indeed caught.

“… we believe that the system would benefit from additional testing and that the existing structure leaves open the potential for preventable failures. Additionally, we recommend that certain elements of operational security be enhanced,” Harbor Labs writes.

In addition, the collected evidence may need further protections to ensure that it can’t be tampered with or fall into the wrong hands.

“… we believe that this collected evidence and other potentially sensitive data is not adequately controlled. While MarkMonitor does protect the data from outside parties, its protection against inside threats (e.g., potential rogue employees) is minimal in terms of both policy and technical enforcement.”

The full recommendations as detailed in the report are as follows:

[Image: the Harbor Labs report’s full recommendations]

The CCI is happy with the new results, which they say confirm the findings of the earlier Stroz Friedberg review.

“The Harbor Labs report reaffirms the findings from our first report – conducted by Stroz Friedberg – that the CAS is well designed and functioning as we hoped,” CCI informs TF.

In the months to come the operators of the Copyright Alert System will continue to work with copyright holders to make further enhancements and modifications to their processes.

“As the CAS exits the initial ramp-up period, CCI has been assured by our content owners that they have taken all recommendations made within both reports into account and are continuing to focus on maintaining the robust system that minimizes false positives and protects customer security and privacy,” CCI adds.

Meanwhile, they will continue to alert Internet subscribers to possible infringements. After nearly two years copyright holders have warned several million users, hoping to convert them to legal alternatives.

Thus far there’s no evidence that Copyright Alerts have had a significant impact on piracy rates. However, the voluntary agreement model is being widely embraced by various stakeholders and similar schemes are in the making in both the UK and Australia.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

Krebs on Security: Convicted ID Thief, Tax Fraudster Now Fugitive

This post was syndicated from: Krebs on Security and was written by: BrianKrebs. Original post: at Krebs on Security

In April 2014, this blog featured a story about Lance Ealy, an Ohio man arrested last year for buying Social Security numbers and banking information from an underground identity theft service that relied in part on data obtained through a company owned by big-three credit bureau Experian. Earlier this week, Ealy was convicted of using the data to fraudulently claim tax refunds with the IRS in the names of more than 175 U.S. citizens, but not before he snipped his monitoring anklet and skipped town.

Lance Ealy, in a selfie he uploaded to Twitter before absconding.

On Nov. 18, a jury in Ohio convicted Ealy, 28, on all 46 charges, including aggravated identity theft, and wire and mail fraud. Government prosecutors presented evidence that Ealy had purchased Social Security numbers and financial data on hundreds of consumers, using an identity theft service called Superget.info (later renamed Findget.me). The jury found that Ealy used that information to fraudulently file at least 179 tax refund requests with the Internal Revenue Service, and to open up bank accounts in other victims’ names — accounts he set up to receive and withdraw tens of thousands of dollars in refund payments from the IRS.

The identity theft service that Ealy used was dismantled in 2013, after investigators with the U.S. Secret Service arrested its proprietor and began tracking and finding many of his customers. Investigators later discovered that the service’s owner had obtained much of the consumer data from data brokers by posing as a private investigator based in the United States.

In reality, the owner of Superget.info was a Vietnamese man, Hieu Minh Ngo, who paid for his accounts at data brokers using cash wire transfers from a bank in Singapore. Among the companies that Ngo signed up with was Court Ventures, a California company that was bought by credit bureau Experian nine months before the government shut down Superget.info.

Court records show that Ealy went to great lengths to delay his trial, and even reached out to this reporter hoping that I would write about his allegations that everyone from his lawyer to the judge in the case was somehow biased against him or unfit to participate in his trial. Early on, Ealy fired his attorney and opted to represent himself. When the court appointed him a public defender, Ealy again chose to represent himself.

“Mr. Ealy’s motions were in a lot of respects common delay tactics that defendants use to try to avoid the inevitability of a trial,” said Alex Sistla, an assistant U.S. attorney in Ohio who helped prosecute the case.

Ealy also continued to steal peoples’ identities while he was on trial (although no longer buying from Superget.info), according to the government. His bail was revoked for several months, but in October the judge in the case ordered him released on a surety bond.

It is said that a man who represents himself in court has a fool for a client, and this seems doubly true when facing criminal charges from the U.S. government. Ealy’s trial lasted 11 days and involved more than 70 witnesses, many of them ID theft victims. His last appearance in court was on Friday. When investigators checked in on Ealy at his home over the weekend, they found his electronic monitoring bracelet but not Ealy.

Ealy faces up to 10 years in prison on each count of possessing 15 or more unauthorized access devices with intent to defraud and using unauthorized access devices to obtain items of $1,000 or more in value; up to five years in prison on each count of filing false claims for income tax refunds with the IRS; up to 20 years in prison on each count of wire fraud and each count of mail fraud; and mandatory two-year sentences on each count of aggravated identity theft that must run consecutive to whatever sentence may ultimately be handed down. Each count of conviction also carries a fine of up to $250,000.

I hope they find Mr. Ealy soon and lock him up for a very long time. Unfortunately, he is one of countless fraudsters perpetrating this costly and disruptive form of identity theft. In 2014, both my sister and I were the victims of tax ID theft, learning that unknown fraudsters had already filed tax refunds in our names when we each filed our taxes with the IRS.

I would advise all U.S. readers to request a tax filing PIN from the IRS (sadly, it turns out that I applied for mine in February, only days after the thieves filed my tax return). If approved, the PIN is required on any tax return filed for that consumer before a return can be accepted. To start the process of applying for a tax return PIN from the IRS, check out the steps at this link. You will almost certainly need to file an IRS Form 14039 (PDF), and provide scanned or photocopied records, such as a driver’s license or passport.

To read more about other ID thieves who were customers of Superget.info that the Secret Service has nabbed and put on trial, check out the stories in this series. Ealy’s Twitter account is also an eye-opener.

TorrentFreak: BitTorrent Users are Avid, Eclectic Content Buyers, Survey Finds

This post was syndicated from: TorrentFreak and was written by: Andy. Original post: at TorrentFreak

Each month 150-170 million Internet users share files using the BitTorrent protocol, a massive audience by most standards. The common perception is that these people are only interested in obtaining content for free.

However, studies have found that file-sharers are often more engaged than the average consumer, as much was admitted by the RIAA back in 2012. There’s little doubt that within those millions of sharers lie people spending plenty of money on content and entertainment.

To get a closer look, in September BitTorrent Inc. conducted a survey among a sample of its users. In all, 2,500 people responded and now the company has published the results. The figures aren’t broken down into age groups, but BitTorrent Inc. informs TF that BitTorrent users trend towards young and male.

Music

From its survey the company found that 50% of respondents buy music each month, with a sway towards albums rather than singles (44% v 32%). BitTorrent users are reported as 170% more likely to have paid for a digital music download in the past six months than Joe Public.

Citing figures from the RIAA, BitTorrent Inc. says its users are also 8x more likely than the average Internet user to pay for a streaming music service, with 16% of BitTorrent users and 2% of the general public holding such an account.

Perhaps a little unexpectedly, supposedly tech-savvy torrent users are still buying CDs and vinyl, with 45% and 10% respectively reporting a purchase in the past 12 months. BitTorrent Inc. says that the latter represents users “engaging and unpacking art as a multimedia object”, a clear reference to how the company perceives its BitTorrent Bundles.

On average, BitTorrent Inc. says its user base spends $48 a year on music, with 31% spending more than $100 annually.

bit-music

Movies

When it comes to movies, 47% of respondents said they’d paid for a theater ticket in the preceding 12 months, up on the 38% who purchased a DVD or Blu-ray disc during the same period.

Users with active movie streaming accounts and those making digital movie purchases tied at 23%, with DVD rental (22%) and digital rental (16%) bringing up the rear.

All told, BitTorrent Inc. says that 52% of respondents buy movies on a monthly basis with the average annual spend amounting to $54. More than a third say they spend in excess of $100.

bit-movie

So do the results of the survey suggest that BitTorrent Inc.’s users have a lot to offer the market and if so, what?

“The results confirm what we knew already, that our users are super fans. They are consumers of content and are eager to reward artists for their work,” Christian Averill, BitTorrent Inc.’s Director of Communications, told TF.

“BitTorrent Bundle was started based on this premise and we have more than 10,000 artists now signed up, with more to come. With 90% of purchases going to the content creators, BitTorrent Bundle is the most artist friendly, direct-to-fan distribution platform on the market.”

It seems likely that promoting and shifting Bundles was a major motivator for BitTorrent Inc. to carry out the survey and by showing that torrent users aren’t shy to part with their cash, more artists like Thom Yorke will hopefully be prepared to engage with BitTorrent Inc.’s fanbase.

Also of note is the way BitTorrent Inc. is trying to position that fanbase or, indeed, how that fanbase has positioned itself. While rock (20%), electronic (15%) and pop (13%) took the top spots in terms of genre popularity among users, 23% described their tastes as a vague “other”. Overall, 61% of respondents described their musical tastes as “eclectic”.

“[Our] users are engaged in the creative community and they have diverse taste. They also do not define themselves by traditional genres. We feel this is a true representation about how fans view themselves universally these days. They are eclectic,” Averill concludes.

While monetizing content remains a key focus for BitTorrent Inc., the company is also making strides towards monetizing its distribution tools. Last evening uTorrent Plus was replaced by uTorrent Pro (Windows), an upgraded client offering torrent streaming, an inbuilt player, video file converter and anti-virus features. The ad-free client (more details here) is available for $19.95 per year.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

Linux How-Tos and Linux Tutorials: Beginning Git and Github for Linux Users

This post was syndicated from: Linux How-Tos and Linux Tutorials and was written by: Carla Schroder. Original post: at Linux How-Tos and Linux Tutorials

fig-1 github

The Git distributed revision control system is a sweet step up from Subversion, CVS, Mercurial, and all those others we’ve tried and made do with. It’s great for distributed development, when you have multiple contributors working on the same project, and it is excellent for safely trying out all kinds of crazy changes. We’re going to use a free Github account for practice so we can jump right in and start doing stuff.

Conceptually Git is different from other revision control systems. Older RCSes tracked changes to files, which you can see when you poke around in their configuration files. Git’s approach is more like filesystem snapshots, where each commit or saved state is a complete snapshot rather than a file full of diffs. Git stays space-efficient because each snapshot stores only the files that changed, and links back to the unchanged files. All changes are checksummed, so you are assured of data integrity, and of always being able to reverse changes.
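You can watch the snapshot-and-checksum model at work in a throwaway repository (a sketch; the file name and identity are illustrative):

```shell
# Build a scratch repo with a single committed file.
tmp=$(mktemp -d) && cd "$tmp"
git init -q
echo "hello" > file.txt
git add file.txt
git -c user.name=demo -c user.email=demo@example.com commit -qm "first"

# Every object is addressed by its checksum:
git rev-parse HEAD            # the commit's SHA-1
git cat-file -p HEAD          # the commit points at a tree (a full snapshot)
git cat-file -p 'HEAD^{tree}' # the tree lists file.txt with its own checksum
```

Identical content always hashes to the same object, which is how unchanged files are shared between snapshots instead of copied.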

Git is very fast, because your work is all done on your local PC and then pushed to a remote repository. This makes everything you do totally safe, because nothing affects the remote repo until you push changes to it. And even then you have one more failsafe: branches. Git’s branching system is brilliant. Create a branch from your master branch, perform all manner of awful experiments, and then nuke it or push it upstream. When it’s upstream other contributors can work on it, or you can create a pull request to have it reviewed, and then after it passes muster merge it into the master branch.

So what if, after all this caution, it still blows up the master branch? No worries, because you can revert your merge.
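For example, a bad merge can be undone with a single git revert -m 1. A self-contained sketch (branch, file, and identity names are illustrative):

```shell
# Set up a scratch repo; g() just supplies a throwaway identity.
tmp=$(mktemp -d) && cd "$tmp"
git init -q
g() { git -c user.name=demo -c user.email=demo@example.com "$@"; }

g commit -q --allow-empty -m "base"   # the mainline starts here
git checkout -qb bad                  # an experimental branch
echo oops > bad.txt
git add bad.txt
g commit -qm "bad change"

git checkout -q -                     # back to the mainline branch
g merge -q --no-ff -m "merge bad" bad # the merge we will regret
g revert --no-edit -m 1 HEAD          # -m 1: keep the first parent's side
test ! -e bad.txt && echo "merge undone"
```

The revert is itself a new commit, so the history of the mistake (and its undoing) is preserved.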

Practice on Github

The quickest way to get some good hands-on Git practice is by opening a free Github account. Figure 1 shows my Github testbed, named playground. New Github accounts come with a prefab repo populated by a README file, license, and buttons for quickly creating bug reports, pull requests, Wikis, and other useful features.

Free Github accounts only allow public repositories. This allows anyone to see and download your files. However, no one can make commits unless they have a Github account and you have approved them as a collaborator. If you want a private repo hidden from the world you need a paid membership. Seven bucks a month gives you five private repos, and unlimited public repos with unlimited contributors.

Github kindly provides copy-and-paste URLs for cloning repositories. So you can create a directory on your computer for your repository, and then clone into it:

$ mkdir git-repos
$ cd git-repos
$ git clone https://github.com/AlracWebmaven/playground.git
Cloning into 'playground'...
remote: Counting objects: 4, done.
remote: Compressing objects: 100% (4/4), done.
remote: Total 4 (delta 0), reused 0 (delta 0)
Unpacking objects: 100% (4/4), done.
Checking connectivity... done.
$ ls playground/
LICENSE  README.md

All the files are copied to your computer, and you can read, edit, and delete them just like any other file. Let’s improve README.md and learn the wonderfulness of Git branching.

Branching

Git branches are gloriously excellent for safely making and testing changes. You can create and destroy them all you want. Let’s make one for editing README.md:

$ cd playground
$ git checkout -b test
Switched to a new branch 'test'

Run git status to see where you are:

$ git status
On branch test
nothing to commit, working directory clean

What branches have you created?

$ git branch
  master
* test

The asterisk indicates which branch you are on. master is your main branch, the one you never want to make any changes to until they have been tested in a branch. Now make some changes to README.md, and then check your status again:

$ git status
On branch test
Changes not staged for commit:
  (use "git add <file>..." to update what will be committed)
  (use "git checkout -- <file>..." to discard changes in working directory)
        modified:   README.md
no changes added to commit (use "git add" and/or "git commit -a")

Isn’t that nice? Git tells you what is going on, and gives hints. To discard your changes, run

$ git checkout README.md

Or you can delete the whole branch:

$ git checkout master
$ git branch -D test

Or you can have Git track the file:

$ git add README.md
$ git status
On branch test
Changes to be committed:
  (use "git reset HEAD <file>..." to unstage)
        modified:   README.md

At this stage Git is tracking README.md, and it is available to all of your branches. Git gives you a helpful hint– if you change your mind and don’t want Git to track this file, run git reset HEAD README.md. This, and all Git activity, is tracked in the .git directory in your repository. Everything is in plain text files: files, checksums, which user did what, remote and local repos– everything.

What if you have multiple files to add? You can list each one, for example git add file1 file2 file3, or add all files with git add *.

When you have deleted files, git rm filename stages the deletion (to stop tracking a file while keeping it on disk, use git rm --cached filename). If you have a lot of deleted files, git add -u stages all of them at once.
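A throwaway repository makes the deletion commands concrete (a sketch; file names and identity are illustrative):

```shell
# Scratch repo with three tracked files.
tmp=$(mktemp -d) && cd "$tmp"
git init -q
touch a.txt b.txt c.txt
git add .
git -c user.name=demo -c user.email=demo@example.com commit -qm "three files"

git rm -q a.txt          # deletes a.txt and stages the deletion
git rm -q --cached b.txt # stops tracking b.txt but leaves it on disk
rm c.txt                 # deleted outside Git...
git add -u               # ...so stage that deletion too
git status --short       # a, b, c staged as deleted; b.txt also shows as untracked
```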

Committing Files

Now let’s commit our changed file. This adds it to our branch and it is no longer available to other branches:

$ git commit README.md
[test 5badf67] changes to readme
 1 file changed, 1 insertion(+)

You’ll be asked to supply a commit message. It is a good practice to make your commit messages detailed and specific, but for now we’re not going to be too fussy. Now your edited file has been committed to the branch test. It has not been merged with master or pushed upstream; it’s just sitting there. This is a good stopping point if you need to go do something else.

What if you have multiple files to commit? You can commit specific files, or all available files:

$ git commit file1 file2
$ git commit -a

How do you know which commits have not yet been pushed upstream, but are still sitting in branches? git status won’t tell you, so use this command:

$ git log --branches --not --remotes
commit 5badf677c55d0c53ca13d9753344a2a71de03199
Author: Carla Schroder 
Date:   Thu Nov 20 10:19:38 2014 -0800
    changes to readme

This lists un-merged commits, and when it returns nothing then all commits have been pushed upstream. Now let’s push this commit upstream:

$ git push origin test
Counting objects: 7, done.
Delta compression using up to 8 threads.
Compressing objects: 100% (3/3), done.
Writing objects: 100% (3/3), 324 bytes | 0 bytes/s, done.
Total 3 (delta 1), reused 0 (delta 0)
To https://github.com/AlracWebmaven/playground.git
 * [new branch]      test -> test

You may be asked for your Github login credentials. Git caches them for 15 minutes, and you can change this. This example sets the cache at two hours:

$ git config --global credential.helper 'cache --timeout=7200'

Now go to Github and look at your new branch. Github lists all of your branches, and you can preview your files in the different branches (figure 2).

fig-2 github

Now you can create a pull request by clicking the Compare & Pull Request button. This gives you another chance to review your changes before merging with master. You can also generate pull requests from the command line on your computer, but it’s rather a cumbersome process, to the point that you can find all kinds of tools for easing it all over the Web. So, for now, we’ll use the nice clicky Github buttons.

Github lets you view your files in plain text, and it also supports many markup languages so you can see a generated preview. At this point you can push more changes in the same branch. You can also make edits directly on Github, but when you do this you’ll get conflicts between the online version and your local version. When you are satisfied with your changes, click the Merge pull request button. You’ll have to click twice. Github automatically examines your pull request to see if it can be merged cleanly, and if there are conflicts you’ll have to fix them.

Another nice Github feature is when you have multiple branches, you can choose which one to merge into by clicking the Edit button at the right of the branches list (figure 3).

fig-3 github

After you have merged, click the Delete Branch button to keep everything tidy. Then on your local computer, delete the branch by first pulling the changes to master, and then you can delete your branch without Git complaining:

$ git checkout master
$ git pull origin master
$ git branch -d test

You can force-delete a branch with an uppercase -D:

$ git branch -D test
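Deleting the branch locally doesn’t touch the copy on Github; if you skipped the Delete Branch button there, git push origin --delete removes it from the command line. A self-contained sketch, with a local bare repository standing in for Github:

```shell
# A bare repo plays the role of the Github remote.
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare origin.git
git clone -q origin.git work 2>/dev/null && cd work
g() { git -c user.name=demo -c user.email=demo@example.com "$@"; }

g commit -q --allow-empty -m "base"
git push -q origin HEAD          # publish the mainline branch
git checkout -qb test
git push -q origin test          # publish the test branch
git checkout -q -
git push -q origin --delete test # remove it from the remote again
git ls-remote --heads origin     # only the mainline branch remains
```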

Reverting Changes

Again, the Github pointy-clicky way is easiest. It shows you a list of all changes, and you can revert any of them by clicking the appropriate button. You can even restore deleted branches.

You can also do all of these tasks exclusively from your command line, which is a great topic for another day because it’s complex. For an exhaustive Git tutorial try the free Git book, and you can test everything with your Github account.

TorrentFreak: Torrents Good For a Third of all Internet Traffic in Asia-Pacific

This post was syndicated from: TorrentFreak and was written by: Ernesto. Original post: at TorrentFreak

download-keyboardOver the years we have been following various reports on changes in Internet traffic, specifically in relation to torrents.

One of the patterns that emerged with the rise of video streaming services is that BitTorrent is losing its share of total Internet traffic, in North America at least, where good legal services are available.

This downward spiral is confirmed by the latest report from Sandvine, which reveals that torrent traffic is now responsible for ‘only’ 5% of all Internet traffic in North America during peak hours, compared to 10.3% last year.

In other countries, however, this decrease is not clearly visible. In Europe, for example, the percentage of Internet traffic during peak hours has remained stable over the past two years at roughly 15%, while absolute traffic increased during the same period.

In Asia-Pacific there’s yet another trend. Here, torrents are booming, with BitTorrent traffic increasing more than 50% over the past year.

asia-pacific

According to Sandvine torrents now account for 32% of all traffic during peak hours, up from 21%. Since overall traffic use also increased during the same period, absolute traffic has more than doubled.

Looking at upstream data alone torrents are good for more than 55% of all traffic during peak hours.

One of the countries where unauthorized BitTorrent usage has been growing in recent years is Australia, which has one of the highest piracy rates in the world.

There are several reasons why torrents are growing in popularity, but the lack of good legal alternatives is expected to play an important role. It’s often hard or expensive to get access to the latest movies and TV-shows in this region.

It will be interesting to see whether this trend will reverse during the coming years as more legal services come online. Netflix’ arrival in Australia next year, for example, is bound to shake things up.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

SANS Internet Storm Center, InfoCON: green: Critical WordPress XSS Update, (Thu, Nov 20th)

This post was syndicated from: SANS Internet Storm Center, InfoCON: green and was written by: SANS Internet Storm Center, InfoCON: green. Original post: at SANS Internet Storm Center, InfoCON: green

Today, WordPress 4.0.1 was released, which addresses a critical XSS vulnerability (among other vulnerabilities). [1]

The XSS vulnerability deserves a bit more attention, as it is an all too common problem, and often underestimated. First of all, why is XSS critical? It doesn’t allow direct data access like SQL injection, and it doesn’t allow code execution on the server. Or does it?

XSS does allow an attacker to modify the HTML of the site. With that, the attacker can easily modify form tags (think about the login form, changing the URL it submits its data to), or the attacker could use XMLHTTPRequest to conduct CSRF without being limited by the same origin policy. The attacker will know what you type, and will be able to change what you type. In short: the attacker is in full control. This is why XSS is rated critical.

The particular issue here was that WordPress allows some limited HTML tags in comments. This is always a very dangerous undertaking. The WordPress developers did attempt to implement the necessary safeguards: only certain tags are allowed, and even for these tags, the code checked for unsafe attributes. Sadly, this check wasn’t done quite right. Remember that browsers will also parse somewhat malformed HTML just fine.

A better solution would probably have been to use a standard library instead of trying to do this themselves. HTML Purifier is one such library for PHP. Many developers shy away from using it as it is pretty bulky. But it is bulky for a reason: it tries to cover a lot of ground. It not only normalizes HTML and eliminates malformed HTML, but it also provides a rather flexible configuration file. Many lightweight alternatives, like the solution WordPress came up with, rely on regular expressions. Regular expressions are typically not the right tool to parse HTML. Too much can go wrong, starting from new lines and ending somewhere around multi-byte characters. In short: don’t use regular expressions to parse HTML (or XML), in particular for security.
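To illustrate the point, here is a contrived shell sketch (the pattern and payloads are illustrative, not WordPress’s actual filter) showing a line-oriented regular expression passing markup that a browser will still execute, because HTML allows whitespace, including newlines, around the = of an attribute:

```shell
# A naive filter: block anything matching an inline event handler.
filter() {
  printf '%s' "$1" | grep -Eq 'on[a-z]+=' && echo BLOCKED || echo ALLOWED
}

filter '<img src=x onerror=alert(1)>'  # BLOCKED -- the easy case

# Same payload with a newline between the attribute name and '='.
# Browsers still read it as onerror=alert(1), but no single line
# matches the pattern, so the filter waves it through.
evil='<img src=x onerror
=alert(1)>'
filter "$evil"                         # ALLOWED -- bypassed
```

An HTML-aware sanitizer parses the attribute the same way the browser does, which is exactly what a regex cannot guarantee.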

[1] https://wordpress.org/news/2014/11/wordpress-4-0-1/


Johannes B. Ullrich, Ph.D.
STI|Twitter|LinkedIn

(c) SANS Internet Storm Center. https://isc.sans.edu Creative Commons Attribution-Noncommercial 3.0 United States License.

Raspberry Pi: Northern Ireland’s first Raspberry Jams

This post was syndicated from: Raspberry Pi and was written by: Liz Upton. Original post: at Raspberry Pi

Liz: Andrew Mulholland is a first-year undergraduate student at Queen’s University Belfast, and the overall winner of 2014’s Talk Talk Digital Hero award. We’ve known him for a few years (he did work experience with us this summer – he created the Grandpa Scarer learning resource for us with Matt Timmons-Brown).

Andrew’s been setting up events to introduce other young people to computing for some years now. He’s recently been running the very first Raspberry Jams in Northern Ireland, and is doing a lot of computing outreach with local schools. I asked him how the kids who’d attended the Jams had found the experience, and he sent me the blog post below. Well done Andrew – it’s brilliant to see how much fun an introduction to computing can be. You’re doing an amazing job.

Northern Ireland November Raspberry Jam

On Saturday 8th November 20+ soon-to-be Raspberry Pi enthusiasts arrived at Farset Labs for the 6th Northern Ireland Raspberry Jam.

farsettjam

September, NI Raspberry Jam 5

This month’s main activities? Sonic Pi 2 and Minecraft Pi!

At the Jam we also have all the previous months’ activities printed out, so that if the kids want to try something else out, they are more than welcome to.

There are activities ranging from Sonic Pi, to Minecraft Pi, to physical computing projects like creating a reaction timer game in Scratch GPIO, along with quite a few others.

asd

Lots of cool stuff to play with!

I asked a few of the kids at the jam to write down what they thought.

haley

Haley (11) having way too much fun hacking someone else’s Minecraft Pi game!

Haley:

“It was my first Raspberry Jam and I was quite nervous when I walked in but one of the mentors came over and introduced himself to me and explained what we would be getting up to. He found me a chair and showed me how to connect all the wires together and by the end of the Jam I was laughing my head off! I really enjoyed learning how to make music using Sonic Pi. I made the tune Frère Jacques. My favourite part was learning how to code while playing Minecraft. Andrew told me I should learn how to code because I had never done it before. I used a programming language called Python to hack others Minecraft games and to teleport them to a random place. I heard another kid start exclaiming after teleporting her several times, initially she had no idea it was me! Andrew and Libby were very supportive the whole day and I learnt a massive amount thanks to them. It was great fun!”

Apparently Haley enjoyed her first Raspberry Jam.


 Katie:

“I heard about the Raspberry Jam because one of the mentors volunteers at my school and the Jam was announced in Assembly as part of EU Coding Week. My friend Rachel and I decided to give it a go. I didn’t know anything about a Raspberry Pi and had no idea what to expect before I went but Andrew and the mentors have taught me loads and are very encouraging. I have just done my second Raspberry Jam and I loved it! I created a piece of music using Sonic Pi, played/hacked Minecraft and played with an LEDBorg in Scratch GPIO! Also we got doughnuts and got to make use of Farset Lab’s huge blackboard! It is the biggest blackboard I’ve ever seen. I don’t have a favorite part because everything I did was great fun and everybody was helpful. I definitely suggest anyone my age giving it a go!”

Rachel and Katie creating music with Sonic-Pi 2


Rachel

“I had a great time at my second Raspberry Jam at the weekend. The thing I enjoyed the most was learning with Scratch with the GPIO pins. This is something my school doesn’t teach so I don’t get the chance to do anything like this normally. It was great fun programming the LEDs to change different colours using a program I wrote.

The Raspberry Jam is such an amazing workshop and I am very grateful to Andrew and Libby for running it! I can’t wait till the December Jam!!”

We didn’t just have young people at the NI Raspberry Jam this month! The Jam is open to people of all ages, coding knowledge and backgrounds.

Never too old to play Minecraft! John (70) getting taught how to play Minecraft Pi by Isaac (10)


A parent:

“These events are really great. It lets the kids experiment with technology that they wouldn’t otherwise have got the opportunity to use in school. Most schools in Northern Ireland don’t seem to offer any coding opportunities for the kids so stuff like this is essential. And Andrew and Libby are great, giving up their Saturdays to come and teach these kids and my son!”

Next month is the Christmas special Jam! We have some secret new activities planned and of course, lots of food!

Some awesome cupcakes baked by @baker_geek for last month’s Jam.


Want to come along to the next NI Raspberry Jam?

Northern Ireland Raspberry Jam is on the 2nd Saturday of every month with NI Raspberry Jam 7 (Christmas special) being on the 12th December at Farset Labs, Belfast.

Tickets are free! (Although we ask for a £3 donation towards the venue if you are able.)

The event is especially aimed at complete beginners to the Raspberry Pi or people just starting out, but we do have some more complex projects and challenges for you if you are an expert.

Special thanks to Libby (16) for helping me with this month’s Jam, and to Farset Labs for basically letting us take over the building for a Saturday afternoon!

You know when you are onto something good when you overhear one of the kids on their way out saying: “Daddy, daddy, can I borrow your phone to book next month’s tickets before they all go?”

Interested in finding a Raspberry Jam near you? Check out our Jams page!

TorrentFreak: U.S. Brands Kim Dotcom a Fugitive, ‘Spies’ on Others

This post was syndicated from: TorrentFreak and was written by: Ernesto. Original post: at TorrentFreak

megaupload-logoIt’s been nearly three years since Megaupload was taken down by the U.S. authorities but it’s still uncertain whether Kim Dotcom and his fellow defendants will be extradited overseas.

Two months ago the U.S. Government launched a separate civil action in which it asked the court to forfeit the bank accounts, cars and other seized possessions of the Megaupload defendants, claiming they were obtained through copyright and money laundering crimes.

Megaupload responded to these allegations at the federal court in Virginia with a motion to dismiss the complaint. According to Megaupload’s lawyers the U.S. Department of Justice (DoJ) is making up crimes that don’t exist.

In addition, Dotcom and his co-defendants claimed ownership of the assets U.S. authorities are trying to get their hands on. A few days ago the DoJ responded to these claims, arguing that they should be struck from the record as Dotcom and his colleagues are fugitives.

In a motion (pdf) submitted to a Virginia District Court the U.S. asks for the claims of the defendants to be disregarded based on the doctrine of fugitive disentitlement.

“Claimants Bram van der Kolk, Finn Batato, Julius Bencko, Kim Dotcom, Mathias Ortmann, and Sven Echternach, are deliberately avoiding prosecution by declining to enter the United States where the criminal case is pending,” U.S. Attorney Dana Boente writes.

“The key issue in determining whether a person is a fugitive from justice is that person’s intent. A defendant who flees with intent to avoid arrest is a fugitive from justice,” he adds.

Since Kim Dotcom and his New Zealand-based Megaupload colleagues are actively fighting their extradition they should be seen as fugitives, the DoJ concludes.

“Those claimants who are fighting extradition on the criminal charges in the related criminal case, claimants van der Kolk, Batato, Kim Dotcom, and Ortmann, are fugitives within the meaning of the statute, regardless of the reason for their opposition.”

Megaupload lawyer Ira Rothken disagrees with this line of reasoning. He told TF that the fugitive disentitlement doctrine shouldn’t apply here.

“The DOJ is trying to win the Megaupload case on procedure rather than the merits. Most people don’t realize that Kim Dotcom has never been to the United States,” Rothken says.

A person who has never been to the United States and is currently going through a lawful procedure in New Zealand shouldn’t be seen as a fugitive, according to Rothken.

The recent DoJ filing also highlights another aspect of the case. According to a declaration by special FBI agent Rodney Hays, the feds have obtained “online conversations” of Julius Bencko and Sven Echternach, the two defendants who currently reside in Europe.

These conversations were obtained by law enforcement officers and show that the authorities were ‘spying’ on some of the defendants months after Megaupload was raided.

tapped

“During a conversation that occurred on or about March 28, 2012, Bencko allegedly told a third-party, ‘I can come to Bratislava [Slovakia] if needed .. bu [sic] you know .. rather not travel around much .. ‘ Later in the conversation, Bencko states ‘i’m facing 55 years in usa’,” the declaration reads.

In addition to the two defendants, law enforcement also obtained a conversation of Kim’s wife Mona Dotcom, who is not a party in the case herself.

“During a conversation that occurred on or about February 9, 2012, a third-party told Mona Dotcom, ‘Also Julius [Bencko] wants Kim [Dotcom] to know that he will be supportive in what ever way possible that he needs’.”

According to the U.S. the ‘tapped’ conversations of Bencko and Echternach show that since they are avoiding travel to the United States, they too can be labeled fugitives.

It’s unclear how the online conversations were obtained, but Megaupload lawyer Ira Rothken told TF that he wouldn’t be surprised if civil liberties were violated in the process, as has happened before in the case.

Whether these fugitive arguments will be accepted by the court has yet to be seen. Highlighting the motion Megaupload submitted earlier, Rothken notes that regardless of these arguments the case should be dismissed because the court lacks jurisdiction.

“The United States doesn’t have a statute for criminal copyright infringement,” Rothken tells us. “We believe that the case should be dismissed based on a lack of subject matter jurisdiction.”

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

Backblaze Blog | The Life of a Cloud Backup Company: There’s No I in Bryan

This post was syndicated from: Backblaze Blog | The Life of a Cloud Backup Company and was written by: Yev. Original post: at Backblaze Blog | The Life of a Cloud Backup Company

blog-bryan
Straight out of Portland, Bryan joins our Datacenter staff to help backup your world! Having had a wide variety of jobs before joining the Backblaze team, including farming and store clerking, Bryan is excited to join the tech industry, and can’t wait to help ensure your data is safe. Let’s learn some more about our fourth, and latest “Brian”!

What is your Backblaze Title?
Datacenter Technician

Where are you originally from?
Before Sacramento, I lived in Portland, Oregon. Before that, I called upstate New York “home”.

Why did you move to Sacramento?
I moved to California to help backup your world!

What attracted you to Backblaze?
I’ve lost data before and it’s horrible. I like knowing that my stuff is backed up securely, and I’d like to help other people know their stuff is backed up too. Backblaze is the place to do this.

From the outside, Backblaze struck me as inventive and ambitious, and the data center work looked like it would switch from thinking/planning to moving/doing and back again throughout the day at a good clip. I’ve been here for a week, and it really does function that way. I love it.

Where else have you worked?
Farms, video rental stores, gas stations, radio waves, computer stores, and offices. You know, the usual.

Tell us how you currently backup your photos, music, data, etc. on your home computer?
Local backups: Time Machine
Bootable backups: Shirt-Pocket’s Super Duper! and Bombich’s Carbon Copy Cloner
Offsite backups: Backblaze

If you won the lottery tomorrow, what would you do?
I would buy you lunch!

How did you get into computers?
In sixth grade when I was 12, my grandparents bought a Packard Bell so they could make spreadsheets tracking their stats in fantasy NASCAR. Every day after school I pedaled my bicycle to their house along ATV trails through the forest, so that I could use the computer. Eventually I was given someone’s used computer. I still visited my grandparents though.

Welcome Bryan! We’re jazzed to have you on board, and will definitely look forward to that lunch after you hit it big with the lotto!

Author information

Yev


Social Marketing Manager at Backblaze

Yev enjoys speed-walking on the beach. Speed-dating. Speed-writing blog posts. The film Speed. Speedy technology. Speedy Gonzales. And Speedos. But mostly technology.

Follow Yev on:

Twitter: @YevP | LinkedIn: Yev Pusin | Google+: Yev Pusin

The post There’s No I in Bryan appeared first on Backblaze Blog | The Life of a Cloud Backup Company.

SANS Internet Storm Center, InfoCON: green: “Big Data” Needs a Trip to the Security Chiropracter!, (Wed, Nov 19th)

This post was syndicated from: SANS Internet Storm Center, InfoCON: green and was written by: SANS Internet Storm Center, InfoCON: green. Original post: at SANS Internet Storm Center, InfoCON: green

When the fine folks at Portswigger updated Burp Suite last month to 1.6.07 (Nov 3), I was really glad to see NoSQL injection in the list of new features.

What’s NoSQL, you ask? If your director is talking to you about Big Data or your marketing folks are talking to you about customer metrics, likely what they mean is an app with a back-end database that uses NoSQL instead of real SQL.

I’m tripping over this requirement this month in the retail space. I’ve got clients that want to track a retail customer’s visit to the store (tracking their cellphones using the store’s wireless access points), to see:

  • if customers visit store sections where the sale items are?
  • or, if customers visit area x, do they statistically visit area y next?
  • or, having visited the above areas, how many customers actually purchase something?
  • or, after seeing a purchase, how many feature sale purchases are net-new customers (or repeat customers)

In other words, using the wireless system to track customer movements, then correlating it back to purchase behaviour to determine how effective each feature sale might be.

So what database do folks use for applications like this? Front-runners in the NoSQL race these days include MongoDB and CouchDB. Both databases do cool things with large volumes of data; what neither does out of the box is security. MongoDB’s own documentation advises: “Ensure that MongoDB runs in a trusted network environment and limit the interfaces on which MongoDB instances listen for incoming connections. Allow only trusted clients to access the network interfaces and ports on which MongoDB instances are available.”

CouchDB has a similar statement at http://guide.couchdb.org/draft/security.html: “it should be obvious that putting a default installation into the wild is adventurous.”
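In mongod’s YAML configuration, that advice boils down to restricting the listening interfaces and enabling authorization (a minimal sketch; the values are illustrative):

```yaml
# mongod.conf -- illustrative values
net:
  port: 27017
  bindIp: 127.0.0.1        # listen only on interfaces trusted clients can reach
security:
  authorization: enabled   # require authenticated, authorized users
```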

So, where do I see folks deploying these databases? Why, in PUBLIC CLOUDS, that's where!

And what happens after you stand up your almost-free database and the analysis on that dataset is done? In most cases, the marketing folks who are using it simply abandon it, in a running state. What could possibly go wrong with that? Especially if they didn't tell anyone in either the IT or security group that this database even existed?

Given that we've got hundreds of new ways to collect data that we've never had access to before, it's pretty obvious that if big data infrastructures like these aren't part of our current plans, they likely should be. All I ask is that folks do the same risk assessments they would do if this server were going up in their own datacenter. Ask some questions like:

  • What data will be on this server?
  • Who is the formal custodian of that data?
  • Is the data covered under a regulatory framework such as HIPAA or PCI? Do we need to host it inside of a specific zone or VLAN?
  • What happens if this server is compromised? Will we need to disclose to anyone?
  • Who owns the operation of the server?
  • Who is responsible for securing the server?
  • Does the server have a pre-determined lifetime? Should it be deleted after some point?
  • Does the developer or marketing team that's looking at the dataset understand your regulatory requirements? Do they understand that credit card numbers and patient data are likely bad candidates for an off-prem, casual treatment like this? (Hint: NO, THEY DO NOT.)

Smart-meter applications are another big data thing I've come across lately. Laying this out end-to-end: collecting data from hundreds of thousands of embedded devices that may or may not be securable, over a public network, to be stored in an insecurable database in a public cloud. Oh, and the collected data impinges on at least two regulatory frameworks (PCI and NERC/FERC), and possibly also privacy legislation depending on the country. Ouch!

Back to the tools to assess these databases: Burp isn't your only option to scan NoSQL database servers. In fact, Burp is more concerned with the web front-end to NoSQL itself. NoSQLMap (http://www.nosqlmap.net/) is another tool that's seeing a lot of traction, and of course the standard usual-suspects list of tools has NoSQL scripts, components and plugins. Nessus has a nice set of compliance checks for the database itself, and NMAP has scripts for CouchDB, MongoDB and Hadoop detection, as well as for mining database-specific information. OWASP has a good page on NoSQL injection at https://www.owasp.org/index.php/Testing_for_NoSQL_injection, and also check out http://opensecurity.in/nosql-exploitation-framework/.

Shodan is also a nice place to look during the recon phase of an assessment (for instance, take a look at http://www.shodanhq.com/search?q=MongoDB+Server+Information).
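For your own address space, the same recon can start with nothing more than stdlib sockets: sweep for the default MongoDB (27017) and CouchDB (5984) listener ports before anyone else does. A rough sketch, with a hypothetical internal host list standing in for your real ranges:

```python
import socket

def port_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a plain TCP connect to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Default listener ports for the two databases discussed above.
NOSQL_PORTS = {"MongoDB": 27017, "CouchDB": 5984}

for host in ["10.0.5.12"]:  # hypothetical internal range; substitute your own
    for name, port in NOSQL_PORTS.items():
        if port_open(host, port):
            print(f"{host}:{port} open -- possible unmanaged {name} instance")
```

An open port isn't proof of a problem, but an unmanaged database nobody in IT knows about will usually show up first as exactly this kind of hit.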

Have you used a different tool to assess a NoSQL database? Or have you had, let's say, an interesting conversation about securing data in such a database with your management or marketing group? Please add to the story in our comment form!

===============
Rob VandenBrink
Metafore

(c) SANS Internet Storm Center. https://isc.sans.edu Creative Commons Attribution-Noncommercial 3.0 United States License.

TorrentFreak: Artists and Labels Now Sue Chrysler Over CD-Ripping Cars

This post was syndicated from: TorrentFreak and was written by: Ernesto. Original post: at TorrentFreak

Toward the end of the last century record labels feared that home taping would kill the music industry.

To counter the threat cassette tape recorders posed at the time, they asked Congress to take action.

This eventually resulted in the Audio Home Recording Act (AHRA) of 1992. Under this law importers and manufacturers must pay royalties on “digital audio recording devices,” among other things.

The legislation is still in play today. Instead of targeting cassette recorders, however, the threats are now other copying devices. According to the Alliance of Artists and Recording Companies, this includes media entertainment systems that are built into many cars.

This week the music group, which lists major record labels and 300,000 artists among its members, sued Chrysler and its technology partner Mitsubishi (pdf) for failing to pay royalties.

The dispute revolves around Chrysler’s media entertainment systems including “MyGIG” and “Uconnect Media Center” which allow car owners to rip CDs to a hard drive.

“These devices are covered by the AHRA, but the defendants have refused to pay royalties on them or include the required serial copy protections,” AARC Executive Director Linda Bocchi comments.

The music group reached out to Chrysler and Mitsubishi hoping to settle the issue, but these talks failed. As a result AARC saw no other option than to take the case to court.

“We had hoped Chrysler and the Mitsubishi Electric companies would settle their liability and begin paying what they owe once they had an opportunity to study and assess the issues,” Bocchi says.

“But it has now become painfully clear they have no intention of complying with the law. While litigation is always a last resort, it is clear this lawsuit is the only way to protect our members’ rights.”

The current lawsuit follows an earlier case against Ford and General Motors, which is still ongoing.

In both cases artists and record labels are looking for statutory damages, which could amount to hundreds of millions of dollars. In addition, they want to prevent the manufacturers from selling these unauthorized devices in their cars.

Ford has already filed a motion to dismiss arguing that AHRA doesn’t apply to their systems, and the other defendants including Chrysler are likely to do the same. Whose side the court will agree with is expected to become clear in the months to come.

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.

Raspberry Pi: A collection of Pis

This post was syndicated from: Raspberry Pi and was written by: Liz Upton. Original post: at Raspberry Pi

Liz: Today’s guest post comes from Alex Eames, who runs the rather wonderful RasPi.TV. He’s been furtling through his drawers, and has discovered he owns a surprising number of Raspberry Pi variants. Thanks Alex! 

Now we have the A+, I thought it’d be a good time to celebrate its ‘birth’ by having a rundown of the various mass-produced models of Raspberry Pi.

I had a look through my collection and was somewhat surprised to see that I have 10 different variants of Raspberry Pi now. There is one I don’t have, but more about that later. Here’s the family photo. You can click it for a higher resolution version.


Rev 1 Model B

In row 1, column 1 we have the Rev 1 model B. Although I was up early on 29th February 2012, I didn’t get one of the first 10,000 Pis produced. This was delivered in May 2012. It’s a Farnell variant (I have an RS one as well, but it does full-time duty as my weather station). This was the original type of Pi to hit the market. It has 256 MB of RAM and polyfuses on the USB ports.

Rev 1 Model B – With Links

In row 1, column 2 you’ll see a slightly later variant of Rev 1 model B. This one has 0 Ohm links instead of polyfuses. It helped to overcome some of the voltage drop issues associated with the original Rev 1, but it introduced the “hot-swapping USB devices will now reboot your Pi” issue, which was fixed in the B+.

Rev 2 Model B (China)

Row 2, column 1. Here we have an early Rev 2 Pi. This one was manufactured in China. It originally had a sticker on saying “made in China”, but I took it off. This one was bought some time around October 2012. The Rev 2 model B has 512 MB of RAM (apart from a few early ones which had 256 MB), mounting holes and two headers called P5 and P6.

Rev 2 Model B (UK)

Row 2, column 2. This is a much later Rev 2 Pi, made at SONY in Wales, UK.

Chinese Red Pi Rev 2 Model B

Row 3, column 1. This is one of the Red Pis made especially for the Chinese market. They are not allowed to be sold in the UK, but if you import one yourself that’s not a problem. It is manufactured to a less stringent spec than the ones at SONY, and is not EMC tested. Therefore it bears no CE/FCC marks.

Limited Edition Blue Pi Rev 2 Model B

Row 3, column 2. I’m not going to go into how I got hold of this. Suffice it to say it was not at all easy, but no laws were broken, and nobody got hurt. RS had 1000 of these made in March 2013 as a special limited anniversary edition to use as prizes and awards to people who’ve made a special contribution to education etc. I know of about 5 or 6 people who have them. (At least two of those people traded for them.) They are extremely hard to get. They come in a presentation box with a certificate. I have #0041. Other than their blueness, they are a Rev 2 model B Pi.

Model A

Row 1, Column 3 is a model A. The PCB is identical to the Rev 2 model B, but it has only one USB port, no Ethernet port, no USB/Ethernet chip and 256 MB of RAM. The $25 model A was released in February 2013. On the day I got mine, the day after launch, I made a quick and dirty “I’ve got mine first” video, part of which ended up on BBC Click. The model A sold about 100k units. Demand for it was far outstripped by demand for the model B, although at one point CPC was offering a brilliant deal on a camera module and model A for £25 (I snagged a couple of those).

Compute Module

Row 2, column 3 is the Compute Module, sitting atop the Compute Module development board. This was launched 23 June 2014 as a way to enable industrial use of the Pi in a more convenient form factor. The module is made so it fits in a SODIMM connector and is essentially the BCM2835, its 512 MB of RAM and 4 GB of eMMC flash memory, with all available GPIO ports broken out. It costs $30 when bought by the hundred.

Model B+

Row 3, column 3 is the model B+. This was launched on 14 July 2014 and was a major change in form factor. Rounded corners, corner mount holes, 40 GPIO pins, 4 USB ports, improved power circuitry and a complete layout redesign. The B+ was announced as the ‘final revision’ of the B. So it would appear that it’s going to be with us for some time.

Model A+

In row 4, all by itself, we have the shiny new Raspberry Pi A+, launched 10 November 2014. It’s essentially the same as a B+ with the USB end cut off. It’s the smallest, lightest, cheapest and least power-hungry Pi of all so far: 23 g, $20, and just half a watt at idle.

So Which One Don’t I Have?

I don’t have a Rev 2 256 MB variant. If you have one and would like to trade or sell it to me, I’d be happy to hear from you (alex AT raspi.tv).

I believe there is also now a red Chinese B+. I’ve not got one of those, but it’s only a matter of time. I wonder if there will be a red A+ at some point too? We Just Don’t Know!


TorrentFreak: If Illegal Sites Get Blocked Accidentally, Hard Luck Says Court

This post was syndicated from: TorrentFreak and was written by: Andy. Original post: at TorrentFreak

The movie and music industries have obtained several High Court orders which compel UK ISPs to block dozens of websites said to facilitate access to copyright-infringing content. Recently, however, they have been joined by those seeking blockades on trademark grounds.

The lead case on this front was initiated by Cartier and Mont Blanc owner Richemont. The company successfully argued that several sites were infringing on its trademarks and should be blocked by the UK’s leading ISPs.

The case is important not only to trademark owners but also to those operating in the file-sharing arena since the High Court is using developments in one set of cases to determine the outcome of legal argument in the other.

The latest ruling concerns potential over-blocking. In some cases target sites move to IP addresses that are shared with other sites that are not covered by an injunction. As a result, these third-party sites would become blocked if ISPs filter their IP addresses as ordered by the Court.

To tackle this problem Richemont put forward a set of proposals to the Court. The company suggested that it could take a number of actions to minimize the problem including writing to the third-party sites informing them that a court order is in force and warning them that their domains could become blocked. The third party sites could also be advised to move to a new IP address.

Complicating the issue is the question of legality. While third-party sites aren’t mentioned in blocking orders, Richemont views some of them as operating unlawfully. When the company’s proposals are taken as a package and sites are operating illegally, Richemont believes ISPs should not be concerned over “collateral damage.”

Counsel for the ISPs disagreed, however, arguing that the Court had no jurisdiction to grant such an order. Mr Justice Arnold rejected that notion and supported Richemont’s efforts to minimize over-blocking in certain circumstances.

“The purpose of Richemont’s proposal is to ensure that the [blocking] order is properly targeted, and in particular to ensure that it is as effective as possible while avoiding what counsel for Richemont described as ‘collateral damage’ to other lawful website operators which share the same IP address,” the Judge wrote.

“If the websites are not engaged in lawful activity, then the Court need not be concerned about any collateral damage which their operators may suffer. It is immaterial whether the Court would have jurisdiction, or, if it had jurisdiction, would exercise it, to make an order requiring the ISPs to block access to the other websites.”

The ISPs further argued that the Court’s jurisdiction to adopt Richemont’s proposals should be limited to sites acting illegally in an intellectual property rights sense. The argument was rejected by the Court.

Also of note was the argument put forward by the ISPs that it is the Court’s position, not anyone else’s, to determine if a third-party site is acting illegally or not. Justice Arnold said he had sympathy with the submission, but rejected it anyway.

“As counsel for Richemont submitted, the evidence shows that, in at least some cases, it is perfectly obvious that a particular website which shares an IP address with a Target Website is engaged in unlawful activity. Where there is no real doubt about the matter, the Court should not be required to rule,” the Judge wrote.

“Secondly, and perhaps more importantly, Richemont’s proposal gives the operators of the affected websites the chance either to move to an alternative server or to object before the IP address is blocked. If they do object, the IP address will not be blocked without a determination by the Court.”

In summary, any third-party site that ends up blocked because it shares an IP address with a site featured in a blocking order will get no sympathy from the High Court if, at Richemont’s discretion, it is deemed to be acting illegally. The fact that it is not mentioned in an order will not save it, but its operators will have a chance to object before it is blocked by UK ISPs.

“This action is about protecting Richemont’s Maisons and its customers from the sale of counterfeit goods online through the most efficient means, it is not about restricting freedom of speech or legitimate activity,” the company previously told TF.

“When assessing a site for blocking, the Court will consider whether the order is proportionate – ISP blocking will therefore only be used to prevent trade mark infringement where the Court is satisfied that it is appropriate to do so.”

Source: TorrentFreak, for the latest info on copyright, file-sharing and anonymous VPN services.