Alkahest my heroes have always died at the end

February 28, 2007

and people wonder why the change to DST worries me…

Filed under: Security,Technical — cec @ 6:38 pm

In 2005, Congress mandated a change to daylight saving time, essentially starting it three weeks earlier (March 11th this year) and ending it a week later. Our local paper is requesting suggestions for what people will do with their 22 extra hours of daylight.  Here’s my suggestion:  spend the time fixing all of the computer problems caused by the DST change.

Essentially, a change to DST is very similar to a self-inflicted Y2K problem.  If you want to get a sense of how time/date issues can affect computer systems that aren’t prepared, take a look at this article regarding the new F-22 Raptor and its problems with the International Date Line. I wasn’t worried about Y2K because we knew about the issue for years and most modern operating systems and software had already addressed it.  With the DST changes, we haven’t had nearly enough notice or preparation.  Now, I’m not stocking up on canned goods, but I’m pretty certain that March 11th is going to bring about extra work.
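If you want a quick sanity check on whether a given machine has picked up the new rules, here’s a minimal sketch using Python’s zoneinfo module (any tz-database-backed library would do): ask for the UTC offset on a date inside the newly added three-week window.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# March 12, 2007 falls inside the three-week window the 2005 law added.
# A machine with an updated tz database reports EDT (UTC-4); one still
# using the pre-2007 rules would say EST (UTC-5) and run an hour off.
noon = datetime(2007, 3, 12, 12, 0, tzinfo=ZoneInfo("America/New_York"))
print(noon.tzname(), noon.utcoffset())  # expect: EDT -1 day, 20:00:00
```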

November 24, 2006

Poindexter

Filed under: Random,Security,Social,Technical,University Life — cec @ 12:55 pm

It’s taken me a bit to write about Admiral Poindexter’s visit and the small group talk we had with him. Let me start by reminding folks that here’s a guy who was convicted of lying to Congress. The conviction was later overturned on a technicality. He’s also very politically savvy. I once asked my father if he would ever pursue becoming a general in the Army. He told me that he was hoping to make full colonel (he later retired as a lt. colonel), but that becoming a general required a literal act of Congress and that you needed to become a politician. I would assume the same is true for an admiral, and doubly so in the case of Poindexter, who managed to become the highest ranking geek in government. All of which is to say: take my impressions with a grain of salt.

When I met Poindexter, he came across as a very kind, gentle and grandfatherly figure. He smokes a pipe and was more than willing to tell stories about his career. It seems that he started in the Navy in college, finishing up with a degree in engineering (w00t!). This was around the time the Soviets sent up Sputnik. The first Russian satellite caused something of a panic in the US and, arguably, did more to encourage investment in science and engineering than any other event. The military’s response was to select 5 men from the Army and 5 from the Navy to pursue graduate degrees in science or engineering, anywhere in the country. Poindexter chose physics at Caltech. After discussing his trials getting into and then through grad school, he noted that he’s never taught physics, never been in a lab, never really used his degree, but that it did give him a solid understanding of the scientific method.

After grad school, he held several different positions, and in each he played the role of technology evangelist. He was one of the first to use computers in the Navy, set up the first video conferencing system among the National Security Council offices, was the first to use email (on a mainframe!) in the White House, etc. Like I said, the highest ranking geek in government.

Shortly after September 11, Poindexter was asked to head up DARPA’s Information Awareness Office (IAO) and its projects. In talking with him, I definitely have the sense of a man who loves his country and truly believes that terrorism is the greatest threat it has ever encountered. I disagree with him regarding the extent of the threat that terrorism presents, and so he and I may disagree on the appropriateness of the IAO, but unlike many politicians, I don’t think that he’s using terrorism to advance other goals. I don’t believe that he’s hypocritical about his work.

So, what is his work? One of Poindexter’s chief complaints is that he (and TIA) were unfairly maligned in the media. If you recall, TIA was presented as a giant “Hoover” of a database. The government would collect information from a number of private sources and perform data mining on it in order to identify (potential) terrorists amongst us. Lots of us who are concerned with security and privacy were worried about this. The privacy angle is disturbing enough, but from the security standpoint, you are creating an attractive nuisance. The first hacker who comes along and can get through the government’s security measures is going to have a huge amount of data. Consolidating databases also increases the likelihood that the businesses involved will use the information for purposes it was never collected for. For example, can you be denied insurance if you are overweight, but grocery records indicate you buy junk food?

Beyond the privacy and security concerns was the very real question of how this was going to work, i.e., would it really keep us safer? Traditional data mining techniques find statistically significant patterns in large data sets. Terrorists (one hopes) are not statistically significant – unless there are a lot more of them than we think. This is actually one of Poindexter’s complaints – that his proposal should never have been called data mining; data mining won’t work. He was working on a “data analysis” system.

In his presentation, Poindexter told us that the media got it wrong. He never planned a single huge database. Instead, he planned to leave the data where it was and to build a distributed database on top. Each participating database would make use of a “privacy appliance,” connected to the query system, that would anonymize the data before sending it along.
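Poindexter didn’t describe the appliance’s internals, but to make the idea concrete, here’s a toy sketch of what an anonymizing query layer might look like. Everything in it – the salt, the field names, the hashing scheme – is my own illustration, not his design.

```python
import hashlib

# Toy "privacy appliance": the data stays with its owner, and pattern
# queries come back as anonymized tokens rather than identities.

SALT = b"per-appliance-secret"  # hypothetical; defeats naive dictionary reversal

def pseudonym(identity: str) -> str:
    """Replace a real identity with a stable one-way token."""
    return hashlib.sha256(SALT + identity.encode()).hexdigest()[:16]

def query(records: list[dict], pattern: dict) -> list[str]:
    """Return tokens for records matching every field in the pattern."""
    return [
        pseudonym(r["name"])
        for r in records
        if all(r.get(k) == v for k, v in pattern.items())
    ]

records = [
    {"name": "Alice Farmer", "purchase": "fertilizer"},
    {"name": "Bob Renter",   "purchase": "van rental"},
]
print(query(records, {"purchase": "fertilizer"}))  # one anonymous token
```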

To detect terrorists, he would have a “Red Team.” This is the group that is intended to think like terrorists. Their job is to hatch plots and to determine what it would take to implement the plots. For example, blowing up a building might require large amounts of fertilizer and fuel oil. Purchasing these supplies would leave a footprint in “information space.” The Red Team would pass this step along to the analysts who would then query the system with this pattern to find anonymous individuals matching it. Of course, purchasing fuel oil and fertilizer would flag every small farmer in the country. So the Red Team would go back and look at step two, perhaps renting a large van. New query pattern, new search. Repeat until you either don’t find anyone, or until you are specific enough to get a legally authorized search warrant.
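Stripped of the intelligence trappings, the analyst’s side of this is just repeated set intersection over anonymized tokens. A minimal sketch (the threshold and the tokens are mine, not anything Poindexter specified):

```python
# Toy version of the iterative narrowing: each step of a hypothetical
# plot becomes a query, and the analyst intersects the anonymized
# matches until the set is small enough to justify a warrant, or empty.

def narrow(candidate_sets: list[set[str]], threshold: int = 5):
    suspects = set(candidate_sets[0])
    for step_matches in candidate_sets[1:]:
        suspects &= step_matches      # must match every step of the plot
        if not suspects:
            return None               # no one fits; shelve the plot
        if len(suspects) <= threshold:
            return suspects           # specific enough to take to a judge
    return suspects                   # still too broad; refine the pattern

fertilizer_buyers = {"t1", "t2", "t3", "t4", "t5", "t6"}  # every small farmer, too
van_renters = {"t2", "t7"}
print(narrow([fertilizer_buyers, van_renters]))  # {'t2'}
```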

Poindexter also noted that this was a research program, not an operational one; that the “total” in TIA was meant to encourage researchers to think broadly; and finally, that the reason the privacy part did not get off the ground sooner is that none of the researchers were interested in this aspect – the program only received two privacy proposals.

Interesting idea. A few problems:

  1. I’ve gone back through the documentation available at the time, and I see nothing about red teams, distributed databases, or privacy appliances. The early architecture diagrams all seem to indicate a monolithic database.
  2. It’s still not clear to me that this would work. The red teams would have to come up with millions of patterns, and even then you are not guaranteed to cover everything.
  3. Regarding research vs. operational: this is a lovely thought, but at the time, IIRC, there were reports of TIA receiving real data. In fact, even as a research project, it would need real data in order to be tested.
  4. Regarding the “total” in TIA – if it was really just meant to encourage broad thinking, that was a pretty scary logo to pick.

So, it may be that this is a refinement of the original ideas, in which case it seems like a good one. From the privacy and security standpoint, this is better suited than the original ideas. However, I don’t think that Poindexter was being entirely forthcoming.

All in all, a very interesting day and a very interesting man.

November 13, 2006

Admiral John Poindexter to speak at Duke University

Filed under: Security,Social,University Life — cec @ 12:06 am

In case you are looking for something interesting to do next week, go to Love Auditorium at Duke University on November 15th at 5pm. Admiral John Poindexter will be giving a talk: “A Vision for Countering Terrorism Through Information and Privacy Protection Technologies for the 21st Century.”  It should be an interesting and thought-provoking discussion of where Poindexter draws the line between security and privacy.

I’ll be meeting with Poindexter and a small group at 3pm – quite the birthday present.

Finally, since I assume that TIA exists, I’m guessing that Poindexter is reading this as I’m posting it, so, “Hi!  Looking forward to meeting you on Wednesday.”

November 6, 2006

North Korean nuclear test

Filed under: Security,Technical — cec @ 8:03 pm

Well, it seems that we finally know what happened with the North Korean nuclear test that fizzled.  They apparently mistranslated the Arabic documents the U.S. posted online.

Okay, so neither of those is really very funny.  On the one hand, the U.S. posted a whole host of Arabic documents from Iraq that had never been examined, in the vague hope that someone would be able to find evidence that Iraq had a WMD program before we invaded.  This was idiotic.  It’s equivalent to my posting an entire database of personal information in the hopes that someone online could determine whether there were Social Security numbers in it.

On the other hand, we’ve got a foreign policy failure under this administration that resulted in one of the most unstable countries in the world building a nuclear device.  I know that it’s been said that the weapon was a dud, but I haven’t seen any recent analysis on this.  Determining destructive yield from seismic data depends on the magnitude of the quake, the depth of the explosion and the matrix it was contained in.  Last I heard, the sub-kiloton results were based on hard rock and a magnitude of ~3.8.  The USGS says the magnitude was 4.2.  If the matrix was softer, this could easily be a 5 kiloton weapon.  But then, I’m not a nuclear proliferation expert, so I could easily be missing new data.
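For the curious, published magnitude-yield relations generally take the form mb = a + b·log10(Y), with the constants depending on the test site geology. Plugging in illustrative hard-rock constants – my assumption for the sketch, not an authoritative calibration – shows just how sensitive the yield estimate is:

```python
# Illustrative only: the constants a and b in the magnitude-yield
# relation mb = a + b*log10(Y) depend on the emplacement geology.
# These particular values are assumptions, not a real calibration.

def yield_kt(mb: float, a: float, b: float) -> float:
    """Invert mb = a + b*log10(Y) to get the yield Y in kilotons."""
    return 10 ** ((mb - a) / b)

a_hard, b_hard = 4.45, 0.75  # assumed hard-rock constants
for mb in (3.8, 4.2):
    print(f"mb {mb}: ~{yield_kt(mb, a_hard, b_hard):.2f} kt")
# mb 3.8 -> ~0.14 kt (sub-kiloton); mb 4.2 -> ~0.46 kt, a factor of
# 10**(0.4/0.75) ≈ 3.4 higher. A softer matrix (effectively a lower a)
# pushes the estimate higher still.
```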

October 23, 2006

Biometrics – fingerprint scanners

Filed under: Security,Technical — cec @ 10:00 am

I recently had a small argument with a vendor selling biometric fingerprint scanners tied to your credit card number.  He said that they were the greatest and most secure thing ever; I said that there weren’t any standards and that the security of the devices was questionable.

I wish I had seen this earlier.

[Embedded YouTube video]

October 13, 2006

Thinking about security and usability

Filed under: Security,Technical — cec @ 11:06 pm

IT security (and, for that matter, other security concerns too) is often seen as conflicting with usability. There is something to that. If you take any given technology and turn up the level of security it provides, you will almost always decrease the usability of the system.

Consider passwords. If people are allowed to choose their own passwords, they will typically choose something very usable for them. They’ll pick their dog’s name, their wife’s name, their userid, etc. These passwords don’t provide much security. To compensate, we often turn up the security knob and require “stronger” passwords, e.g., minimum of six characters with no dictionary words and multiple “character classes.”
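For concreteness, the sort of “stronger password” rule I have in mind is easy to state in code. A toy sketch (the six-character minimum and three-class requirement mirror the example above):

```python
import string

# Toy policy check: minimum length, multiple character classes,
# no dictionary words. Thresholds are illustrative.
CLASSES = [string.ascii_lowercase, string.ascii_uppercase,
           string.digits, string.punctuation]

def strong_enough(pw, min_len=6, min_classes=3, dictionary=frozenset()):
    """Reject short passwords, dictionary words, and too few classes."""
    if len(pw) < min_len or pw.lower() in dictionary:
        return False
    used = sum(any(c in cls for c in pw) for cls in CLASSES)
    return used >= min_classes

print(strong_enough("rover"))     # False: too short, one character class
print(strong_enough("R0ver!pw"))  # True: eight characters, four classes
```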

[Image: security-usability.png]

Adjusting the password strength knob is reasonable to an extent. I’ve recently heard security officers consider requiring fifteen-character passwords with multiple character classes. Such a password is unusable. Any system that requires that level of security should not be protected by user-chosen passwords, and possibly not by passwords at all. To maintain usability while increasing security, you have to use a new technology.

Consider the graph to the right (click for a larger view); it illustrates this principle. The blue line represents a given security technology: as you increase the security, you decrease the usability. In a security-usability graph, we really want to be in the upper right corner, but our blue line can’t get us there. When we make the passwords more complicated (secure), they become less usable. To get further up in the graph, we need to change the technology and shift the security curve to the right (the green line). For example, we might allow weaker passwords but require two-factor authentication with a smart card.

Unfortunately, many proposed security technologies might even shift the curve to the left (the red line). These technologies provide less security for the same degree of usability.  Think of the prohibition on liquids while flying.  This provides no increase in security while greatly decreasing the usability (or at least the enjoyability) of flying.

[Image: security-usability2.png]

If we’re lucky, our security curves don’t look like the graph above and instead look more like the one to the left (click for a larger view). The advantage of a curve like this one is that there’s a fairly natural optimal point. We can increase the security while barely affecting the usability – at least up to a point.

I don’t know what the security curves for most technologies look like, but security technologists need to consider this and determine both the level of security and the level of usability needed in a given system. If you can’t achieve both, then you might need to think about a different approach or a different security technology. Trying to achieve a desired level of security without considering usability will just result in users ignoring or bypassing the security.

Just some thoughts.

October 1, 2006

for the record, Kip Hawley is an idiot

Filed under: Security — cec @ 2:05 pm

I missed this when it came out last week, but apparently a gentleman named Ryan Bird was detained at the airport for writing “Kip Hawley is an idiot” on his plastic baggie filled with toiletries. Apparently, the security trolls (sorry, highly educated and diligent TSA employees) took the statement to be a threat, or at least behavior they didn’t approve of, and detained Mr. Bird – while telling him that he didn’t have First Amendment rights at the TSA checkpoint.

Kip Hawley is the current director of the TSA and, yes, he is an idiot. This is the guy whose organization seems to watch more bad action movies than it reads real risk assessments.  If you recall, a month or so ago we had a complete ban on liquids on planes due to the fear of binary explosives; the ban was later loosened to allow small amounts of toiletries on the plane *if* they were put into clear baggies so that they could be viewed. The problem is that this was not based on any science or chemical engineering knowledge. The feared binary explosives are notoriously difficult to produce, would require hours of work, and would produce noxious smells that would (one hopes) be noticed.

Unfortunately, this is characteristic of the way we are handling transportation security these days. Rather than assessing the risks and taking reasonable actions based on those assessments, we are running around trying to look like we’re doing something. This goes for everything from removing shoes to limiting liquids to arming pilots – none of it makes sense in a security context.

September 25, 2006

Presentation

Filed under: Personal,Security — cec @ 9:14 pm

I survived giving my presentation today – in spite of the fact that I showed up at the wrong hotel, in the wrong part of the city.  I blame my boss.  I mentioned that the talk was at the Sheraton Imperial (although I hadn’t yet looked to see where that was) and he said, oh yeah, the one down on Campus Walk road.  Get to the hotel on Campus Walk – oops, that’s the Millennium.  Call my administrative assistant; she looks up the location and it’s 15 miles away.

I still made it on time.  Had a room of 25 people or so.  I asked, but no one wanted to do the more interesting version of the talk (the one with volunteers, paper airplanes and silly hats).  I’m not sure why.  Maybe because I didn’t have any silly hats on me.

There was a reception at the end of the day (about a half hour from when I finished).  As usual, I didn’t stay.  It’s funny – this is the real difference between introverts and extroverts.  I am fairly introverted, but at the spur of the moment can still give a dynamic, well-received talk to a room full of people.  I’m not self-conscious about being goofy or making jokes.  What makes me introverted is that after that, I’m done.  I’m just out.  I want to go home, have a drink, relax.  If I had gone to the reception, I would have sat in the corner with my drink ignoring everyone.  Introversion and extroversion have nothing to do with presentation anxiety.  The difference is that extroverts get energized and are ready for more.  For introverts, not so much.

September 24, 2006

Oops…

Filed under: Personal,Security — cec @ 9:22 pm

Checking my schedule for tomorrow, I realized that I have to give a talk at a major “human studies” conference about security risks in web-based surveys.  Unfortunately, I haven’t actually prepared anything.  I’ve got my slides from the last time I gave the talk, but I really wanted to do something more interesting and interactive this time – preferably with audience volunteers wearing silly hats.  I may still try to put something together in the morning.  I have an idea – but no sense of its feasibility.

Unfortunately, I don’t have the silly hats.

Stock spam

Filed under: Security,Technical — cec @ 6:26 pm

One of the disadvantages of having so many email accounts is the amount of spam you get. Recently, I’ve been noticing an increase in stock spam making it through my spam filters. I’ve been wondering how effective the spam is and whether or not one could make money shorting these stocks.

Apparently, I’m not the only one. The local paper carried a NYT article titled “Many people fall for stock spam.” The article describes the work of Frieder and Zittrain, who found that pink sheet stocks heavily touted in spam were significantly more likely to be traded than non-touted stocks. Purchasing these stocks would lead to a 5.25% loss within two days; for the most heavily touted stocks, the average loss was almost 8% in two days.

To get a sense of what these look like, I read through the 700+ spam messages collected in my spam folder over the past week. I feel like I’ve been dumpster diving. However, amid the emails claiming that I can enlarge body parts, get cheap watches and drugs, improve my sex life and buy human growth hormone, I found a few dozen messages touting 10 different stocks.

Looking at the stocks online shows that, sure enough, in the day or two around the time I got the spam, there was a substantial increase in trading volume and, in several cases, a noticeable increase in the share price. Now, if I really wanted to test this, I would start selling these stocks short any time I received stock spam. Figure maybe a thousand dollars per stock. A 5% drop on a shorted stock in two days is nothing to ignore 🙂
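Back of the envelope, the trade looks like this – a sketch that ignores borrow fees, bid-ask spreads on thinly traded pink sheets, and the very real risk that a touted stock spikes before it falls:

```python
# Hypothetical short against a spam-touted stock: sell $1,000 of
# borrowed shares, then buy them back after the average two-day drop.
stake = 1_000.00
avg_drop = 0.0525                  # Frieder & Zittrain's two-day figure
proceeds = stake                   # cash in from selling borrowed shares
buyback = stake * (1 - avg_drop)   # cost to repurchase after the drop
print(f"gross profit: ${proceeds - buyback:.2f}")  # ~$52.50 per stock
```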

