I am presenting on security at JavaOne this year — whoot!  My session is entitled, “CON12803 – Making the Future Secure with Java”.  Full session details are at the end of my post.
The energy for the conference is phenomenal.  Everyone is working super hard on their presentations and doing lots of multitasking to keep things going.  Most presenters are also spectators, myself included.  It’s great fun to see all the projects and share knowledge, see what’s new in the vendor space, and more.  I’m really looking forward to the event.
As the Java security guy, I receive many questions around Java platform security.  They range from details about vulnerabilities, which I don’t discuss, to future plans for Java security.  I’m still pretty new on the job, but given the tidal wave of questions around security, a presentation on the topic is very appropriate — and what better venue than JavaOne.  The session is intended to give attendees some background on our security programs as well as the future direction for Java platform security.  See you at JavaOne!
Presentation Details
“CON12803 – Making the Future Secure with Java”

Monday, Oct 1, 8:30am – 9:30am
Hilton San Francisco – Continental Ballroom 7/8/9
[Image: smoke stack. Sustainability News Photo [5]]

[Updated Post Friday October 25, 2013]
The pollution vs. security analogy still resonates with me, but there is a flaw in the model I’ve been pondering — and it’s significant.  Most often (but not always) pollution is visible.  When you’re stuck in traffic, smog hangs like a cloud.  The air stinks.  Likewise, a polluted river or stream is usually easy to spot.  The breakdown in the model is that poor security is not easy to identify, although its effects usually become apparent eventually.

[Original Post] 
Sure, computer security sucks, but who’s to blame?  Do you blame the hackers?  These criminals are the pirates of the Internet, exploiting vulnerabilities in our applications and plundering our data!  But wait a minute — what about the careless application programmers writing vulnerable code?  The test team failing to discover the security flaws?  How about the companies we trust to guard our personal information?  Likewise, the governments we trust to keep us safe?  Whose fault is bad security?

In September of 2010 I attended the Open Web Application Security Project (OWASP)[3] conference in Irvine, California.  David Rice, leader of the Monterey Group[1] and author, delivered a powerful keynote presentation[2].  It’s a message that resonates loudly with me today.  Mr. Rice draws an analogy between pollution and security.  In the heyday of the industrial era, pollution was seen as the mark of prosperity: if you weren’t polluting, you weren’t prosperous.  After decades of environmental destruction and sickening our people, laws and attitudes changed.

Today we see pollution as toxic to the environment, not an acceptable consequence of successful business.  Security is following a similar trajectory.  Today poor security is largely an accepted consequence of writing software.  After all, hackers are really smart and writing code without vulnerabilities is hard — right?  Writing secure code costs lots of money and stifles creativity and productivity — correct?  Our parents heard similar “can’t have your cake and eat it too” arguments, but it was the 50’s and the subject was industrial pollution.  Play it forward: it’s now almost exactly two years to the day since Mr. Rice’s OWASP keynote.  What’s changed?  Attitudes, if only slightly.  People are fed up and begging for some accountability.

I saw an article[4] on TechRepublic about holding software developers accountable for their security vulnerabilities.  The argument is that if your hamburger is poisoned you can sue the restaurant.  But if a computer hacker steals your credit card information, destroys your credit rating, and makes your life a miserable hell, why can’t you sue the software developer?  For starters, you’re not bound by an End User License Agreement (EULA) when you eat a hamburger.  On the other hand, the last time I clicked to accept new terms for iTunes on my iPhone, the agreement was more than 60 pages.  That’s more paperwork than I received when purchasing my first home!  Honestly, I don’t know what the hell I accepted, but I guess somehow it’s legally binding.

Today products are generally considered secure unless they are proven vulnerable.  Consumers have no good way to evaluate product security without expensive testing.  It’s too easy for companies to say products are secure, roll the dice, and risk falling short on promises.  As a result, there really is no incentive to make the deep investments necessary in security.  Invest too deeply and your products are not competitive.  To be clear, nobody ever says “do a bad job”; what happens is that security resources are not funded to match the level of risk.

For epic change to occur, businesses must compete on a level playing field.  If product security posture were as easy to evaluate as choosing a loaf of bread in the supermarket, all vulnerabilities would be fixed overnight.  What seems more likely is that accountability will be established through changes to laws and litigation, or — yup, you guessed it — regulatory compliance.  I’m not a big fan of more government in my life, but then again such regulation has been beneficial overall for the environment.

An aside: on NPR News[6] not long ago, I heard that plants producing frozen pizza are inspected by the Food and Drug Administration, but frozen pizza with meat is inspected by the U.S. Department of Agriculture.  The government needs to draw the line somewhere, and evidently it’s directly down the center of your pizza.  Likewise, any regulatory path for security will produce a few mushrooms along the way.

[1]  The Monterey Group,

[2]  Rice, David. “OWASP AppSec USA 2010: Keynote: David Rice 1/3.” David Rice Key Note. Irvine, California, 2 Sept. 2010. YouTube. OWASP, 04 Dec. 2010. Web. 03 Sept. 2012. <>.

[3]  The Open Web Application Security Project,

[4]  Heath, Nick. “Should Developers Be Sued for Security Holes?” TechRepublic. TechRepublic, 23 Aug. 2012. Web. 02 Sept. 2012. <>.

[5]  Sustainability News Photo. Digital image. Climate Protection. City of Las Vegas, 3 Sept. 2012. Web. 3 Sept. 2012. <>.

[6] Naylor, Brian. “U.S. Considers Overhaul Of Food Safety System.” All Things Considered. National Public Radio. California, 4 Sept. 2012. NPR. NPR, 25 Feb. 2009. Web. 04 Sept. 2012. <>.

I recently purchased a 2012 Toyota Prius C hybrid — great car.  I really love the gas mileage, and the fact that it’s easy on the environment is an added bonus.  Returning to the house with my new car, I cracked open the owner’s manual[1] for the first time.  On pages 18 and 19 I noticed a section, “Vehicle control and operational data recording” — hmm, this is interesting.  The manual goes on to say the vehicle is equipped with sophisticated computers recording vehicle operation.  My first thought was: a black box, like the type found on modern aircraft, only in my car.  Toyota calls the black box the Event Data Recorder (EDR).

The Prius manual describes the following operational parameters subject to recording.

  • engine speed
  • electric motor speed
  • accelerator status
  • brake status
  • vehicle speed
  • shift position

While it’s clear these parameters are recorded, it’s not clear what else may be recorded.  For instance, a Prius purchased with the navigation system option also has GPS coordinates and time/date information.  Knowing where and when events occur makes them much more valuable.  The manual specifically notes that no conversations, sound, or pictures are recorded.  Oh, what a relief.  Toyota notes the data is used for research, development, and quality improvement.

Finally, the Prius manual goes on to say Toyota will not disclose EDR data to third parties except…

  • Upon consent of owner/lessee
  • Official request by police, court, or government agency
  • Research not specific to owner or vehicle

The first bullet is me, the Prius owner.  I can approve the distribution of my EDR data to a 3rd party.  Ok, makes sense.  The second bullet — police, court, or government agencies — concerns me.  The effect is that if you’re involved in a crash, local, state, or federal officials can collect your EDR data without your knowledge or consent.  Similarly, EDR data could be gathered during a surveillance operation when you take your car to the dealer’s for an oil change.  Not likely, I admit, but if it’s possible, then it’s safe to assume such surveillance will occur at some point.  The concern is that EDR data can be used against owners in a court of law or for surveillance purposes.  While you may be able to exercise your 5th Amendment rights in court, it’s likely your car is not subject to such protections.  One trip to the Electronic Frontier Foundation[2] will convince you our laws have not caught up to our technological capabilities — to the detriment of our privacy.

In my opinion, until privacy laws improve, I wish manufacturers would not provide features that have no immediate consumer benefit and may potentially be used to violate privacy.  I’m not talking about doing away with the Internet or light bulbs — these have huge consumer benefits.  I considered writing Toyota and asking how to deactivate the EDR; after all, it’s for research and quality control according to them.  It should not be important to the operation of the vehicle — right?  Perhaps there is some Prius hack info on the net.

[1] “Toyota Prius C Owners Manual.” Toyota. Web.<>.
[2] Electronic Frontier Foundation, <>