|Stream A - Future-state Security
|Stream B - Topical Issues
|Stream S - SABSA World Congress
|Stream P - Plenary Sessions
Today’s idea of a competent vision of IT security remains reactively driven by technical details. Each time event X occurs, a technical security response Y is created, evaluated and implemented as part of a responsive strategy. Each new response impacts the properties of previous responses, yet we assume that our security objectives remain constant. For the last two decades I have challenged this assumption with the notion of anthropic security.
The sheer number of events across the enterprise infrastructure means that the dimensions of security problems can become too complex to fully recognize and adequately communicate. As a result, our security processes lead to inflationary restrictions within the fabric of the security enterprise. As security experts we should strive to provide optimum throughput for our consumers, and to do that we need to change our view of the security paradigm. If this is true, the question is: how do we do that?
As experienced security experts we arrive at this table with the innate understanding that we view the concept of security through preconceived filters. The methodology of our present security policies is not fully capable of expressing the problems, for the same reasons that our technical experts do not agree on the most advantageous perspective. Due to the inflexibility of our present security language we need to establish a better security paradigm that encompasses the complexities we encounter. I therefore pose the possibility of creating such a paradigm through the broader notion of object oriented security.
In the late 1980s I originated the idea of object oriented cryptographic key management security because I believed that an object oriented approach would change the way we provide secure networked communications. Today I also believe that for such an approach to be successful, our official policies and documentation need to be structured and associated in such a way that we are able to model and see what we are doing. The first step is to change the way that we orient our language.
The representation of security should be discussed not as secrecy but as an act of efficiency and effectiveness. The value of such an approach is that, as experts, we shall collectively be able to avoid much of the unexpected entropy that results when we change policy while utilizing cryptographic solutions to improve the efficiency and effectiveness of network communications. I would argue we are not presently doing this, in part because we lack a unified means to model and discuss explicit applications with respect to implicit philosophy.
I would like to offer an ambitious theoretical explanation of the relationship between the object oriented paradigm and the fine-grained access and content control that can achieve this. My intention is to discuss the reasons why Virtual Private Networking, while incredibly successful, has not been adequately expressed. Finally, I shall also touch upon the reasons why objects are necessary to prevent the continued splintering of the web.
If we are to believe that information theory and physics are essentially the same ideas expressed in different ways, then information theory is bounded by the same consequences as physics, not by the devices and policies we impose upon it. If we as a collective civilization are not ready, willing and able to recognize, embrace and utilize our complexities, then we shall be randomly affected by them.
My approach shall be to open up the relevance of this topic as a set of observations lasting less than fifteen minutes, which shall include a handful of slides and perhaps an animation. At the conclusion I shall break the issues down into three salient group discussion points.
The first of these discussion points involves the implications and opportunities of dealing with blind notions of unrecognized complexity and the necessity of mandatory degrees of freedom to obtain optimal solutions.
The second discussion point is the establishment of a comprehensive paradigm that maintains and manages the structure of cause and effect.
The third discussion point concerns the importance of convergence: recognizing where we as a technical society are going next, which aspects of technical security such an approach fundamentally changes, and what that means.
My goal is to initiate a global dialogue which changes the perception of proportionality between the explicit and implicit future expectations of security through intelligent and rational object oriented methodologies.
||Chaos, Cybernetics, Cynefin & SCAN in Enterprise Systems Engineering
The fields of Enterprise Architecture and Enterprise Systems Engineering strive to analyze human and technical systems from a premise that diverges from Newton’s principles of classical mechanics, which are by nature deterministic. This systems approach “shies away” from the reductionistic atomism of Newton and aligns itself with the more holistic systems method espoused by Ludwig von Bertalanffy’s General Systems Theory. General Systems Theory advocates the idea that each system interacts with its environment in a real and meaningful fashion such that the environment changes the system and the system changes its environment. Thus, there is the possibility for evolutionary changes to systems, which may result in serendipitous events in our enterprises.
Unfortunately, even highly ordered systems may fall into disarray by developing various levels of randomness and disorder that are intensified by the system’s inherent complexity and dynamism. Both the Enterprise Architect and the Enterprise Engineer need suitable methods to manage the range of order, from simplicity to ultimate chaos and disorder. Fortunately, several systems approaches have been developed to address this dilemma: Cybernetics, Cynefin (pronounced “kuhnevin”), and SCAN.
Each of these methods offers novel insights into understanding and dealing with increased systemic complexity, which leads to uncertainty and increased risk. For this reason alone, it is well worth the time and effort of the security practitioner to become aware of their potential value to enhance and preserve their enterprises.
||Increasing Resilience & Reliability of Software-Based Systems
Many of us have worked with systems which have proved less robust and resilient than we desired or expected. These flaws are often the result of errors in the specification, design and implementation of the software they run. Worse yet, some of these flaws arise because of the very need to test software, and have produced high-profile disasters, such as the one that afflicted the launch of Ariane 5. And why does software get worse quicker than hardware gets more powerful? Unfortunately, since each line of code can be regarded as a component, as software grows in size, reliability is bound to fall.
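To make that last point concrete, here is a minimal, purely illustrative sketch (my own assumption, not a model from the presentation) that treats each line of code as an independent component with the same probability of being correct, so that overall reliability decays geometrically as the code base grows:

    # Illustrative only: treat every line of code as an independent component
    # that is correct with probability r; "system reliability" is then r**n.
    def system_reliability(lines_of_code: int, per_line_reliability: float) -> float:
        return per_line_reliability ** lines_of_code

    for n in (1_000, 100_000, 10_000_000):
        print(f"{n:>12,} lines -> {system_reliability(n, 0.999999):.6f}")
    # Even at 99.9999% per line, ten million lines leave well under a 0.01%
    # chance that every line is correct: reliability falls as software grows.

The independence assumption is obviously crude, but it captures why adding code, all else being equal, works against reliability.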
In this presentation I will discuss some of the changes in programming - both practice and language design - that could increase resilience and reliability. Topics will include:
- Resource (RAM, CPU, disk, response time) budgeting, which is often completely ignored, yet used to be regarded as critical.
- Genuine semantic (rather than grammatical) checking by compilers, to catch errors earlier in the implementation cycle.
- Highly expressive languages, to increase productivity, and therefore leave more budget for testing. (But often at the cost of slower production code and run-time discovery of bugs which could have been caught earlier.)
- Run-time error correction versus dual 'debug' and 'production' builds, which often means production systems contain untested code paths.
- Self-healing operating systems, languages, and systems.
- Fail safe probabilistic proofs of correctness and budget conformance.
||Slaying the Hydra: Evolution and Mitigation of Denial-of-Service Attacks
One of Hercules' first challenges was his battle with the Lernean Hydra, the many-headed mythological serpent who sprouted two new heads every time one was removed. Hercules would feel right at home in today's datacenters, where mitigation of distributed denial-of-service (DDoS) attacks can feel like an unwinnable game of Whack-A-Mole.
In the past few years, the magnitude of DDoS attacks has grown at a disconcerting pace. The largest DDoS attack in 2012 peaked at 100Gbps; the first quarter of 2014 brought a 400Gbps NTP amplification attack. Despite the security industry's best efforts to encourage protection of the end-user systems and patching of the vulnerable servers that enable these assaults, successful attacks seem to be taking place with increasing regularity and volume.
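As a rough illustration of why amplification makes these volumes attainable (the figures below are assumptions chosen for the example, not measurements from any particular incident), the upstream bandwidth an attacker needs shrinks in proportion to the amplification factor of the abused protocol:

    # Back-of-the-envelope reflection/amplification arithmetic (illustrative figures only).
    def required_attacker_bandwidth_gbps(target_gbps: float, amplification_factor: float) -> float:
        """Upstream bandwidth needed to reflect target_gbps of traffic at a victim."""
        return target_gbps / amplification_factor

    # Hypothetical amplification factor of 200x for a UDP reflection vector.
    print(required_attacker_bandwidth_gbps(400.0, 200.0))  # -> 2.0 Gbps of spoofed requests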
Denial of service is not a new problem; simplistic attacks such as ping floods and SYN floods have been around almost as long as the Internet has existed. The rise of botnets, vast collections of malware-infected zombie systems, led inexorably to the appearance of distributed denial-of-service attacks. Attackers, too, have evolved: script kiddies harnessing the power of Metasploit, Anonymous launching the Low Orbit Ion Cannon (LOIC) against targets ranging from the US Copyright Office to the Motion Picture Association of America (MPAA) to PayPal, and cyber-criminals using threats of DDoS as a method of extortion.
This session will provide an overview of the various forms of DDoS attack active today, who is launching them, and why. We will then discuss mitigation techniques that reduce the impact of and potentially stop the attacks entirely, comparing the benefits and caveats of each approach.
What happens when old school banking process bumps up against internet-era banking operations? Welcome to the nightmare of the OOB, or Out of Balance.
When modern banking operations collide with legacy systems, bizarre anomalies easily lead to events like a $10,000 loan being disbursed as a $10 million loan, which modern "self-balancing" software automatically corrects by generating a myriad of contra entries that no one understands, or even looks at. In this session you will see how a rapidly dwindling workforce of old-timers - banking gurus and wizards from the 70s and 80s - are leaving behind a set of processes that are difficult or impossible to maintain with commercial off-the-shelf software, and why those urban legends about a small deposit getting extra zeros added on to the end and going unnoticed are often true.
Data is being collected at an impressive rate today, and much of it is not directly attributable to the nature of the activity during which it is collected. This can be seen clearly in the use of the Internet. Data is collected about the past locations of the user, or at least of the browser. GPS companies are tracking the locations and speeds of vehicles. Phone companies are tracking the location, routines, activity and friends of their customers. Games are collecting information about their users and the devices the games are played on. The list goes on and on.
We have many names for this: product development, target marketing, surveillance, security, and many more. But how deep does the data collection go? Are the custodians of the data practicing good stewardship? How much can actually be determined from this information? Is the data being used for the good of humanity? Is it even ethical to dig so deeply into people’s private lives? Nationality aside, is there a basic human right to privacy, and do these efforts cross that line?
Big Data presents both opportunities and challenges to our current understanding of SIEM data. The very nature of Big Data allows individuals to derive whatever is desired from the data; however, how do we gather meaningful information? Understanding how to get the most out of Big Data requires a mind shift that runs counter to the training of security professionals. This talk begins by defining Big Data and its key architectural components, then moves to an explanation of data lineage and how data lineage can be used to inform and structure queries. Finally, we will provide examples that illustrate how SIEM data can be expanded in the Big Data environment to provide greater network situational awareness.
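As a purely hypothetical sketch of the "lineage informs the query" idea (the field names, sources and events below are invented for illustration), lineage metadata carried with each event lets a query be scoped to records whose provenance is actually trusted:

    # Hypothetical example: use data lineage to scope a SIEM-style query.
    events = [
        {"src_ip": "10.0.0.5", "action": "login_failed",
         "lineage": {"source": "dc01_auth_log", "transform": "normalised_v2"}},
        {"src_ip": "10.0.0.9", "action": "login_failed",
         "lineage": {"source": "legacy_syslog", "transform": "unknown"}},
    ]

    trusted_sources = {"dc01_auth_log"}

    # Lineage-aware query: only count failures whose provenance we can vouch for.
    trusted_failures = [
        e for e in events
        if e["action"] == "login_failed" and e["lineage"]["source"] in trusted_sources
    ]
    print(len(trusted_failures))  # -> 1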
This session is designed to be interactive in nature. The presenter will introduce various specific topics and fully expects the audience to participate, question and debate the fine points. Attendees are encouraged to bring their own problems.
||Data Classification & Information Identification in the Age of Big Data & Linked Open Data
||Andrew S. Townley
Big Data. Er... what is it exactly?
A recent lead-in for an MIT Technology Review piece on Big Data from late last year highlights one of many problems anyone actually dealing with it needs to solve before they can seriously think about it from an information security perspective:
"Big Data is revolutionizing 21st-century business without anybody knowing what it actually means."
How can you talk about Big Data Security if nobody agrees on what it really means?
Linked Open Data (LOD). What's that?
According to linkeddata.org, Linked Data is about using the Web to connect related data that wasn't previously linked, or using the Web to lower the barriers of linking data currently linked using other methods.
Currently, Linked Data is mostly the domain of academics and still closely tied to the W3C's Semantic Web initiative, but quietly, a lot of people in many governments have been using the principles of linked data to publish previously unavailable information to the public using these technologies, including the US, UK, the EU, Brazil and Australia.
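To make the idea tangible, here is a minimal sketch (assuming the Python rdflib library; the URIs are illustrative) of publishing a few statements as Linked Data, where identifiers are HTTP URIs that other publishers' datasets can point at and reuse:

    # Minimal Linked Data sketch using rdflib (example URIs are illustrative only).
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import FOAF, RDF

    EX = Namespace("http://example.org/id/")

    g = Graph()
    g.bind("foaf", FOAF)

    person = EX["person/1234"]
    g.add((person, RDF.type, FOAF.Person))
    g.add((person, FOAF.name, Literal("Jane Example")))
    # The "linked" part: reuse a URI minted by a different publisher
    # instead of a purely local key.
    g.add((person, FOAF.based_near, EX["place/dublin"]))

    print(g.serialize(format="turtle"))

The point is not the library but the discipline: give each discrete piece of data a resolvable identifier so that anyone else's data can link to it.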
The primary driver listed in the Report on Digital Government: Building a 21st Century Platform to Better Serve the American People illustrates the crux of this session:
"We're moving from managing documents to managing discrete pieces of open data and content which can be tagged, shared, secured, mashed up and presented in the way that is most useful for the consumer of that information."
Looking at Big Data and Linked Data shows us that the way people, organizations and governments think about, publish and consume digital information is changing, and, in fact, already has changed. However, there are still far too many questions about what it all means, how to do it right, and specifically, how do we keep it secure where the definition of "security" is an open-ended context of potential uses?
In this session, we'll explore these issues and some of the ways our current thinking isn't ready for the world we find ourselves in today. We will discuss the identity problem not from the traditional perspective of end-users as security subjects, but from the perspective of the data and information entities themselves. Finally, we will explore some potential ways we can leverage what we know in new ways to try to address these challenges.
Big Data is here and Linked Data is lurking in the corner. Don't underestimate either of them and the ways they will impact our role as information security professionals.
This talk will provide an insight into the resources and tools that should be considered when looking to gain an understanding of the risk associated with moving to a cloud environment, and further discusses methods for assessing the security of cloud service providers (CSPs), primarily focused on Cloud Security Alliance (CSA) methodologies and tools.
The review will specifically look at the CSA STAR (Security, Trust and Assurance Registry), a searchable registry which allows potential cloud customers to review the security practices of providers, accelerating their due diligence and leading to a higher quality procurement experience.
Additionally, Ross will discuss the CSA CCM (Cloud Controls Matrix), which can be leveraged to assess individual cloud providers that may not yet appear on the registry. Ross will demonstrate how the CCM overlaps with existing industry standards and regulatory requirements, with specific emphasis on PCI DSS nuances regarding scoping and network segregation in the cloud.
Several security certification schemes have been created for organizations, products and security professionals. These security certifications are used as “quality labels”: having the right kind of label means that there is a good level of assurance of security. So, if an organization or individual hands over sensitive assets to a certified entity, there is no reason to be worried. Or if a product carries a security label, we can be sure that it works as it was designed.
But what does a security certification really tell us about the certified entity? Can we be sure that a certification, i.e. compliance with a set of security requirements, means a good level of security?
This presentation discusses the limits of assurance: how much assurance various types of security certification actually give, and how much must still be covered by trust. The factors impacting the credibility of certification schemes are also analyzed: how the standards are constructed, the importance of auditor accreditation schemes, the individual auditor’s competence, organizational culture and management commitment, among others.
The presentation is designed to have a practical orientation, and several different security standards and certification schemes are analyzed.
The purpose of this presentation is not to suggest that certifications are useless; quite the contrary. However, understanding the limitations of certifications is essential so that an organization can establish the right level of trust and controls.
||Reducing the Unknown Unknowns: Using SABSA to Improve Threat Modelling & Risk Assessment
“There are known knowns, there are things we know we know. We also know there are known unknowns, that is to say, we know there are some things we do not know. But there are also unknown unknowns, the ones we don’t know we don’t know.” - Donald Rumsfeld, 2002.
There is risk associated with everything we do, from crossing the road and driving to and from work, to the use of information systems to support and enable business capabilities. While we accept the risk associated with most day-to-day activities without too much conscious consideration, we spend a large amount of time trying to identify and manage risks associated with the use of information technology. Despite this, many information security risks are still not identified or effectively managed: these are our “unknown unknowns”.
This session will be an interactive discussion exploring how threat modelling and risk assessment can be improved to reduce the number of “unknown unknowns”. It will cover a range of topics including:
- Techniques commonly used to identify threats and their relative strengths and weaknesses.
- Approaches to improving the quality and repeatability of threat modelling and risk assessment.
- Using SABSA methodologies and techniques to improve threat modelling and risk assessment.
- Presenting the relationship between the identified risks and the business opportunities, goals and objectives.
Risk is the downside of opportunity. “Nothing ventured, nothing gained”, they say. On the other hand: no goal, no risk. What do the traditional vectors of risk, being ‘chance of event occurring’ and ‘impact of event’, tell us about the relationships of goal and time with risk? And how does that reflect on current business practices?
Besides cyber risk, there is business risk, project risk, architecture risk, safety risk, etc. What is the relationship and proper balance between the management of those risks? Did the big banks that recently failed spend too much on cyber risk and too little on business risk?
For cyber risk management, ISO 27005 is a widely used standard. It helps to identify the areas where measures should be taken, based on a risk assessment. However, the risk assessment data often carry a huge degree of uncertainty that is not displayed in the results matrix. So, does it really address risk, or are we missing something here?
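To illustrate the point about hidden uncertainty with an invented example, compare the single likelihood-times-impact figure that typically lands in a results matrix with the range implied by honest lower and upper bounds on the same inputs:

    # Illustrative only: a point estimate hides how wide the plausible range really is.
    def risk(likelihood: float, impact: float) -> float:
        return likelihood * impact

    # Hypothetical assessment: annual likelihood 5-25%, impact 100k-2M (currency units).
    point = risk(0.10, 500_000)                            # the single number the matrix shows
    low, high = risk(0.05, 100_000), risk(0.25, 2_000_000)

    print(f"matrix value:    {point:,.0f}")                # 50,000
    print(f"plausible range: {low:,.0f} - {high:,.0f}")    # 5,000 - 500,000

A spread of two orders of magnitude collapsed into one cell is exactly the kind of information a results matrix throws away.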
Finally, security controls are generally considered a negative thing, a burden you need to carry in order not to be vulnerable. I’d like to turn this around. A security measure offers protection to an asset, or it contributes to achieving a security objective. Thus, security measures have a positive effect on the achievement of business goals, and by qualifying this relationship it is possible to prioritize them according to the business value they add.
Westpac Banking Corporation is a multinational financial services provider.
Westpac is one of Australia’s "big four" banks, and is the second-largest bank in New Zealand. At COSAC 2009, we presented “A Journey Through Modernisation of an ESA at a Leading Bank”, and this year we are back to share our achievements, lessons learned, and next steps.
With this presentation we will demonstrate how the SABSA Method influenced the security architecture of Westpac's Strategic Investment Programs including, but not limited to, large scale:
- Enterprise perimeter security environments;
- On-line banking platforms and applications;
- Enterprise service bus implementations.
We will outline Westpac’s approach to managing the risks of current industry trends, such as infrastructure virtualisation, private cloud deployments, business asset zoning and service management.
We will share our real-world lessons on “what can go wrong” in addition to the great things that “can be achieved”.
What EA framework do you use and why? Which is the best? If you are using EA, do you need an ESA (SABSA)? How about the opposite? If you are using SABSA, do you need to use an EA framework in addition?
We asked ourselves these questions and began to research. We found an existing document from 2007 that performed a comparative analysis of four major EA frameworks: TOGAF, Zachman, Federal Enterprise Architecture (FEA), and Gartner. We reviewed and updated the matrix and then added SABSA to the comparison. We then looked at the comparison matrix for any important EA requirements that were not included. What we found may surprise you.
Not for the faint of heart, this session will review this comparative analysis of EA Frameworks and will challenge participants to a debate that may change the way you think about enterprise architecture.
In this presentation I would like to share how data protection was implemented in a large enterprise, growing from a 6-page presentation into a complete enterprise rollout. Learn how the business justification was made, how the standards were created, how sensitive data discovery was performed, and where encryption fit and, most importantly, where it did not. Also learn how certain sensitive data was tokenized and where tokenization is not a silver bullet, how Big Data was leveraged and where it was chaotic, how non-production environments were secured with secure data management, and how the whole thing was monitored. The crucial component: a data protection security architecture based on the SABSA methodology and implemented using the TOGAF framework.
||Spread Spectrum, Wireless Security & The World’s Most Beautiful Geek
For the Tony Sale Memorial session of 2014, we cover the unlikely pairing of an actress lauded repeatedly as the world’s most beautiful woman and a composer whose best-known work caused riots in the streets and fistfights in the theatres. The pairing was not romantic, but inventive. Way back in the early 1940s, they invented and patented a “Secret Communication System”: the basis for solid security in a massive number of current devices.
She certainly didn’t look like a geek. And in her movies and interviews, she never sounded particularly technical. Yet Hedy Lamarr, frequently cited in the 1930s, 40s and 50s as “the most beautiful woman in the world,” along with her co-inventor, composer George Antheil, was granted the patent on a frequency hopping technique that serves as one of the cornerstones of security for wireless devices even today.
A rare mix of brains and beauty, Ms Lamarr never received a cent from any of her inventions. Hedy and George signed over the rights to frequency hopping spread spectrum to the US Navy, but the invention was so far ahead of its time that the Navy didn’t initially understand what it had.
We’ll cover her story, the invention and how it relates to security for the mobile, connected world of 2014 and beyond.
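For those curious about the core idea before the session, here is a hedged toy sketch (modern software, far removed from the electromechanical scheme of the original patent) showing the essence of frequency hopping: both ends derive the same pseudo-random hop schedule from a shared secret, so an eavesdropper or jammer who lacks the secret cannot predict which channel to follow:

    # Toy frequency-hopping sketch: both ends derive an identical hop schedule
    # from a shared secret. Demonstration only; random() is not cryptographic.
    import random

    CHANNELS = list(range(88))  # 88 channels, a nod to the 88 keys of a piano (illustrative)

    def hop_sequence(shared_secret: str, hops: int) -> list[int]:
        rng = random.Random(shared_secret)
        return [rng.choice(CHANNELS) for _ in range(hops)]

    sender = hop_sequence("our-shared-secret", 10)
    receiver = hop_sequence("our-shared-secret", 10)
    assert sender == receiver  # both sides hop in lockstep
    print(sender)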
||The COSAC Rump Session
The hugely popular COSAC "rump" is an informal rapid-fire session in which participants give very short presentations on recent results, work in progress, and any other topic of interest to the COSAC community.
Presentations may be purely technical, entirely management oriented in nature, or of any combination of approaches or perspectives. Those wishing to give a talk at the rump session must submit a short abstract, no more than one page long, according to one of the following procedures:
- Electronic submission: Send email to the rump session chair David Lynas at email@example.com before 10AM GMT Friday, September 26.
- Hardcopy submission at conference: Hand the submission to David Lynas at the conference before noon on Wednesday October 1.
Submissions should include a requested amount of time for the presentation. An anticipated maximum of four minutes will be allocated for each presentation.