
In case you missed any of the 2016 Federal Forum or were unable to attend, here is your opportunity to further your education on how to modernize your network infrastructure.
On this site you can:
- Download the e-book that highlights the best of the 2016 Federal Forum
- Dive into eLearning courses focused on the technologies that will improve your network performance
- View 14 real-life demonstrations that exhibit the most advanced networking technologies
This is your opportunity to learn how to modernize your agency’s network. Take advantage of this access and feel free to share with your colleagues.
Catch up on the 2016 Federal Forum Highlights
eLearning Courses: Deep Dive by Topic
Additional Resources

Network Security
New Security Architecture for Federal Networks

Federal networks are under constant attack. It’s been estimated that over 3 trillion cyber-attacks are carried out annually, a mind-boggling number. Federal agencies need a new set of security principles to prevent these relentless attacks from disrupting the delivery of vital services to users, the warfighter and the general public.
Older networks are vulnerable to attack because of their static nature and the inefficiencies of hardware-only solutions. Many federal IT leaders think their data is being protected in flight, but often this is not the case. Most security policies were developed decades ago and have not kept pace with rising network performance levels and user expectations.
The private sector has been moving away from perimeter-based security for many years. The massive OPM data breach last year was made more damaging by the fact that once intruders penetrated the outer network defenses, there were no additional layers of security to hamper their access to government data.
Newer software-based solutions make possible a dynamic, layered security approach that offers protection from the device all the way back to the data center. By leveraging Network Function Virtualization (NFV), services such as firewalls, VPNs, routing and load balancing can be managed in real time, without the need for physical deployment.
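To make this concrete, here is a minimal sketch of what instantiating a virtual firewall through an NFV orchestrator’s northbound API could look like. The endpoint, payload schema and `deploy_virtual_firewall` helper are hypothetical, illustrating the pattern rather than any vendor’s actual interface.

```python
import requests

# Hypothetical NFV orchestrator endpoint. Real orchestrators (e.g., those
# following the ETSI MANO model) expose comparable REST interfaces, but the
# URL and payload schema here are illustrative only.
ORCHESTRATOR = "https://nfv-orchestrator.agency.example/api/v1"

def deploy_virtual_firewall(tenant: str, subnet: str) -> dict:
    """Request a virtual firewall instance in front of a tenant subnet."""
    payload = {
        "service_type": "firewall",   # could also be "vpn", "router", "load_balancer"
        "tenant": tenant,
        "attach_point": subnet,
        "policy": {"default_action": "deny", "allow": ["tcp/443"]},
    }
    resp = requests.post(f"{ORCHESTRATOR}/services", json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()   # e.g., {"service_id": "...", "status": "deploying"}

if __name__ == "__main__":
    # No truck roll, no rack space: the service exists as soon as the
    # orchestrator schedules it.
    print(deploy_virtual_firewall("bureau-a", "10.20.30.0/24"))
```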
As agencies consolidate data centers and share resources, data is traveling farther from the data center even as data rates on the network increase. Protecting this data, both in the data center and in transit, is a critical concern. A recent Market Connections survey found that 76 percent of agencies have encryption protocols in place for data in flight, but 62 percent of those are using SSL, when more sensitive data transfers require a minimum of 128-bit encryption strength for secret traffic and 256-bit for top secret traffic. And as networks get faster, data-protection tools must scale to match these rising data rates. Many of the antiquated encryption tools in use cannot operate at 10 Gbps or higher, rates that are quickly becoming prevalent in federal IT infrastructure environments.
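For illustration, here is a minimal sketch of what 256-bit encryption of data in flight looks like in practice, using the open-source Python `cryptography` package and AES-256-GCM. The payload and key handling are simplified for brevity; real deployments would establish keys through a protocol such as TLS or IPsec rather than sharing them directly.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# AES-256-GCM: a 256-bit key, matching the strength cited above for
# top secret traffic (128-bit is the minimum for secret traffic).
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

nonce = os.urandom(12)   # 96-bit nonce; must never be reused with the same key
plaintext = b"payload in flight between data centers"
ciphertext = aesgcm.encrypt(nonce, plaintext, None)

# The receiver, holding the same key, authenticates and decrypts.
# GCM also detects tampering: a modified ciphertext raises InvalidTag.
assert aesgcm.decrypt(nonce, ciphertext, None) == plaintext
```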
A new kind of network architecture is needed for real cybersecurity, one that gives you the ability to customize security by geography, function or application. Security becomes pervasive, and the network “learns” what constitutes an anomaly. And since this is all done in software, CapEx costs can be reduced by as much as 90 percent.
Not only are current static security practices insufficient for today’s threat environment, they are incapable of supporting the government’s move to cloud computing. In the cloud era, the lines of demarcation blur between network and data center. All parts of the network need to be enabled with dynamic security and play a role in the overall cybersecurity strategy. This can happen when network security is:
- Designed in, not bolted on
- Open, not closed
- Self-learning, not static
- Based on behavior, not just identity
The “cyber sprints” launched last year by U.S. CIO Tony Scott have raised awareness in federal IT circles of the need for a new security paradigm. Contractors and commercial hyperscale companies like Google and Amazon that have built more secure networks can share their knowledge with government agencies. There is a proven roadmap for achieving better security through a more open, multivendor and software-centric network.
There is never a final answer when it comes to cybersecurity. The threats are dynamic and ever-changing. That’s exactly why the networks have to go beyond static and become more dynamic and flexible in response. The challenge never ends, but the network security evolution needs to start today.

Acquisition
The Time is Now for Federal IT Acquisition Reform
Federal IT decision makers operate in a difficult environment. The budgetary climate is challenging, and approximately 80 percent of IT spending goes toward keeping increasingly obsolete infrastructure operating.
The ultimate goal of government IT is the delivery of services to the citizen, the warfighter and the veteran. Innovation has stagnated, and IT performance is starting to degrade the mission rather than enable it. Reforming the IT acquisition process is an important step the federal government can take to reverse these trends.
The failure is one of execution more than design. The Federal Acquisition Regulation (FAR) is supposed to encourage “competition to the maximum extent possible.” This is the correct goal, since competition for government IT improves performance, drives innovation and reduces costs. If the federal IT acquisition cycle as defined by the FAR were actually followed, Federal IT would be in a much better state today.
Unfortunately, current acquisition practices do not follow the prescribed cycle and undercut real competition for Federal IT solutions. Requirements documents are too prescriptive, scoping the solution rather than the IT need and often specifying brands by name. There needs to be alignment between IT requirements and mission outcomes.
Weak governance and accountability allow critical steps like market research to be ignored or outdated information to be used. The old expression “garbage in, garbage out” regrettably applies. Overall, a “check box” culture fuels an acquisition status quo that serves no one well — agencies, contractors or citizens.
In addition, the broken IT acquisition culture is hampering the move of government IT to the cloud. Here are some steps that can start to break the cycle of inefficient Federal IT acquisition:
- Increase market research and link requirements to agency mission outcomes defined as functions, capabilities and service levels, absent brand names.
- Enforce the use of open standards in RFPs and IT implementations and prohibit the use of proprietary protocols.
- Embrace multivendor IT implementations to ensure flexibility, innovation and competition to return the best value and lower the cost of IT.
- Become more agile by refocusing OpEx funds from aging infrastructure support to acquiring innovative technology via new acquisition models like ITaaS and cloud. With limitless upgradability, scalability and agility, agencies can eliminate waste and ensure that they deploy exactly the assets and technology they need, when and where they need it.
- Adopt a strategic approach, rather than a product or vendor-oriented approach to IT and networking investments. This means leveraging new companies and technologies to build large, secure and scalable environments at lower cost, with interoperable, multi-vendor solutions based on open standards.
- Train and educate the workforce to embrace and encourage competition by harnessing the FAR.
Taking the steps above is the best way to embrace the promise of the cloud and as-a-service delivery models. The way acquisition is practiced today ties the government to a hardware-based, proprietary technology model of IT infrastructure that the private sector is rapidly leaving behind. Executing the FAR properly can unlock the performance, efficiency and simplicity of networks driven by software and open standards.
The proposed IT Modernization Fund (ITMF), as well as its alternative, the MOVE IT Act, are encouraging steps toward acquisition reform. By shining a light on the current dire situation for federal IT innovation, these bills clearly illustrate the need for a better road forward.
Networking technology is being transformed — moving from static to dynamic, closed to open, high CapEx to utility-like pricing. Previously IT networks were used for the tactical transport of data. Now they can be strategic enablers of the mission.
Additional Resources on Acquisition:
Debunking the Myths of Network Modernization
Improving Agency IT with FITARA

New IP
Federal IT Networks Require a New IP

Today’s federal IT networks are at a crossroads. As U.S. CIO Tony Scott compellingly explained at the recent 2016 Federal Forum, government networks are locked into old technology and not innovating. Viewed across all agencies, Scott said, the situation has become “insanity at scale.”
Starting in the late 1960s, government began to leverage IT primarily as a way to automate previously manual processes. Each process within each agency was given a line item for such automation. Initially, 15-20 percent of the money was budgeted for maintenance.
The enormous problem today is that most agencies are locked into 1980s-era IT, and the percentage of dollars spent on maintenance has flipped. Scott estimated that agencies now spend approximately 80 percent of their IT budgets on maintaining current technology and programs, with little or nothing left over for innovation or system enhancements.
Meanwhile, private sector IT has raced forward, roughly doubling capacity every five years. Hyperscale companies such as Google and Amazon have leveraged new technologies to unlock gigantic value and productivity that consumers experience every day. They have dramatically raised expectations of what IT can accomplish, and have given federal IT executives a clear roadmap to follow.
What’s needed today is the New IP: a modern networking architecture that allows government to harness the power of Cloud, Mobile, Social and Big Data technologies. The New IP is about enabling network infrastructure to meet the exploding demands of users. It is grounded in a set of principles: open with a purpose, automated by design, programmable, user-centric, systemic and evolutionary.
It begins with infrastructure upgrades to fabric-based physical foundations and evolves to software-defined virtual services and advanced methods of control and orchestration. This is an end-to-end proposition that federates network, servers, storage, applications and the edge to deliver the unified, rich experience that federal end users, citizens and warfighters demand.
Components of the New IP
Today’s federal agency networks must evolve in five main areas to take advantage of the New IP:
- Open Standards – Standards-based products enable choice that increases flexibility while reducing cost and complexity. These benefits help accelerate the rate of innovation. Agencies that take advantage of open standards are more agile and better positioned to adapt to advances such as software-defined networking (SDN);
- Multivendor Environments – Multivendor environments control costs and increase innovation through competition. They encourage the use of best-of-breed solutions and put the federal government in charge of its networks. They also prevent vendor lock-in and ensure continuous innovation in functionality and performance;
- Ethernet Fabrics – Flattened architectures that simplify the network by replacing traditional point-to-point relationships. These software-enabled fabrics take cues from the software environment to drive the hardware. They offer simplicity and automation, reduce network complexity, and are a core component of the New IP;
- Network Function Virtualization (NFV) & Software-Defined Networking (SDN) – The freedom to programmatically control the way data flows through a network eases manageability, supports automation, and helps administrators more quickly deliver customized services in minutes instead of days or weeks;
- Alternative Procurement Models – Opting for a vendor-neutral, requirements-based approach allows agencies to choose from a wider variety of solutions to meet their price, performance, and flexibility needs. In acquiring those solutions, agencies can use resources more efficiently through an alternative “pay-as-you-go” approach that spends Operating Expense (OpEx) dollars rather than Capital Expense (CapEx) funds.
A recent survey by research firm Market Connections shows that significant challenges remain in getting the government on the path to the New IP: 90 percent of IT decision-makers say open standards are important, but only 47 percent are planning to adopt them. On a more positive note, 70 percent are considering broader adoption of SDN.
The network of the future is a heterogeneous environment. Agencies need to ensure that they work with best-of-breed partners for all elements of their network: edge, data center and storage.
The right partners can provide government with a networking infrastructure on par with the leading hyperscale companies of today. And as Tony Scott made clear, the transformation to the New IP needs to start now.

Machine Learning
Machine Learning is Changing the Way Humans Experience Technology

The manifestations of machine learning touch almost every part of our lives. Yet people often do not make the connection between machine learning and the amazing new products and services becoming broadly available today. Throughout human history, knowledge came from evolution, from experience and through culture. Now knowledge is also created by machines.
Take, for example, how Apple’s Siri assistant or Amazon’s Echo learns user preferences over time. Autonomous cars powered by Google are in large-scale trials in multiple states. We are slowly becoming accustomed to speaking with machines, and to expecting them to learn our preferences and needs over time.
The technological development making these new services possible is a seismic shift, on par with the development of the Internet itself. Hyperscale companies like Google and Amazon have led the way over the past 10-15 years, focusing first on dramatically scaling compute and storage capabilities. Today machine learning is re-imagining the network component as well.
"In the past humans wrote code to tell networks what to do — now networks learn what to do themselves."
David Meyer, Chief Scientist, Brocade
“The acceleration of machine learning — a subfield of artificial intelligence — is the most important trend I’ve seen in my 35 years working in technology,” said David Meyer, Chief Scientist at Brocade, who presented at the 2016 Federal Forum in June. “In the past humans wrote code to tell networks what to do. Now networks learn what to do themselves.”
This ability to learn is powering a profound change in IT networking. Previously, networks could do nothing without specific coding for orchestration, packet forwarding and routing. Now machines can do this themselves, dramatically increasing performance and fundamentally changing the role of the network engineer.
Here is a short list of networking capabilities made possible by machine learning:
- Security/Anomaly detection — Recognize malicious network traffic and alert the analyst in real time (a minimal detection sketch follows this list).
- Network Function Virtualization (NFV) orchestration and optimization — NFV is a major trend in service provider automation, and to quote Chris Wright, VP and CTO of Red Hat, “machine learning is the way we are going to automate your automation.”
- Improved automation tools for DevOps — As in the case of NFV, machine learning is the next phase of automation, which we will see in almost all aspects of our technology.
- Prediction and mediation of mobile network problems — Machine learning can, among other capabilities, learn to predict anomalous protocol transactions which can signal nascent problems in the mobile network.
- Operator/analyst intuition capture — Perhaps surprisingly, deep neural networks are able to capture the intuition of human operators or analysts, endowing such systems with human-like prediction capabilities (and beyond).
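As a concrete illustration of the first item, here is a minimal anomaly-detection sketch using scikit-learn’s `IsolationForest`. The flow features and traffic numbers are synthetic stand-ins for real NetFlow/IPFIX data, not measurements from any actual network.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Toy flow features: [bytes_sent, packets, duration_s]. In practice these
# would come from NetFlow/IPFIX records; the values here are made up.
rng = np.random.default_rng(0)
normal_flows = rng.normal([5e4, 40, 2.0], [1e4, 8, 0.5], (500, 3))
suspect_flow = np.array([[9e6, 9000, 0.2]])   # a burst that looks like exfiltration

# Train on ordinary traffic so the model learns what "normal" looks like.
model = IsolationForest(contamination=0.01, random_state=0).fit(normal_flows)

print(model.predict(suspect_flow))   # -1 flags the flow as anomalous
```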
Most of private industry is now progressing down the path blazed by the hyperscale leaders. Federal IT, already saddled with decades-old infrastructure, cannot afford to fall farther behind. Both federal networks and federal IT staff need to leverage the lessons of leading vendors whose infrastructure powers the biggest service providers in private industry.
The right partner isn’t needed just for existing products, but also for where machine learning can take us. Technological development can open up possibilities not yet perceived. After all, Amazon didn’t massively build out its infrastructure to become a public cloud provider, but now the company leads the market.
With technology progressing so rapidly, constant education is vital. Freed from the constant need to simply “keep the lights on,” data center managers and network engineers can become more like service brokers inside their agencies, helping internal groups harness the power of machine learning.
There has never been a more exciting time in IT than today. And it’s past time for federal networks to jump on board.

Mobility
Preparing Government for the Mobility Revolution

Society is in the midst of a mobility revolution. For most citizens today, a smartphone is becoming their primary computing device. Mobility has become the rule rather than the exception, and users expect to have wireless connectivity anywhere, anytime to their personal and professional information. Download speeds and the overall user experience have improved dramatically in just the past few years. Network providers must embrace the right technology to meet these soaring mobility expectations.
Service providers in the private sector have responded to this shift and are designing and deploying networks that use the latest wireless technology to deliver unprecedented performance. New York City is revamping its thousands of pay phones as wireless access points. Stadiums and malls know they must provide reliable wireless access or risk losing audience and market share.
Unfortunately, the federal government has for the most part been a laggard in addressing mobility through the adoption of wireless technology. Government employees typically enjoy better Wi-Fi connectivity at home than they do on the job. This has been mostly due to security concerns, but those concerns can now be addressed by recent advancements.
Both of the main wireless technologies — 802.11 Wi-Fi and Long Term Evolution (LTE) cellular — have advanced in recent years, and the two have complementary strengths. Wi-Fi has greater capacity and makes more efficient use of spectrum. LTE is excellent for voice communications and often offers a better user experience.
The latest evolution of Wi-Fi, 802.11ac Wave 2, dramatically increases data throughput, approaching the speeds of a wired connection. It also adds multi-user MIMO (MU-MIMO), enabling much denser deployments. LTE, sometimes called 4G, is being built out in more places by service providers, and there have been exciting developments around supplying LTE over unlicensed spectrum.
The most prominent of these efforts is known as OpenG, which is supported by a coalition of companies including Google, Intel, Nokia, Qualcomm, Federated Wireless and Ruckus Wireless. Wi-Fi and cellular standards are converging, and OpenG is leveraging and standardizing the use of unlicensed spectrum made available by the FCC for the Citizens Broadband Radio Service (CBRS). OpenG combines coordinated shared-spectrum capabilities with neutral-host-capable small cells that greatly improve in-building cellular coverage without any changes to the provider’s core network.
The latest wireless technology allows for secure wireless networks to scale. For example, software-based controllers can be deployed to manage up to 30,000 wireless access points with secure authentication.
This would mean the Pentagon could run its own wireless LAN to improve phone coverage in its massive headquarters. When the cellular signal fails, the call is passed to the local wireless access point, which applies the necessary security policies to ensure that all end devices are secure.
Since Wi-Fi has become so ubiquitous in day-to-day life, users have become accustomed to searching out wireless networks. If the federal government does not start to offer more wireless options, users will be tempted to access less secure, open networks for connectivity. This could easily become a security vulnerability, so federal network providers need to get a wireless plan in place.
Ensuring your wired network keeps up with wireless innovations is another part of ensuring security. Embracing software-defined networking (SDN) provides the ability to manage both wired and wireless networks through a single pane of glass.
Federal customers can leverage wireless technology that powers the leading networks in industry verticals such as hospitality and education. Agencies should look for contractors with a proven commercial track record. The wireless needs of hospitality in particular match those of the military, which needs to provide high-performance wireless networks to densely populated environments such as barracks around the globe.
Wireless is the onramp to a modern network that serves mobile users. It’s time for government to take advantage of technology and best practices that have already been proven in the commercial sector. Wireless connectivity is essential, efficient and can be made secure with newer technology and security policy management.
Additional Resources on Mobility:
Customer Story: USS Midway Museum Turns Wi-Fi into Business Tool

SDN
How to Move Federal IT Forward with Software-Defined Networking
Today’s federal IT networks are using technology that is increasingly obsolete. To become more efficient and to better support the agency mission, federal IT must embrace the principles of software-defined networking (SDN).
In legacy networks, switches, routers and other network devices are managed individually, with the focus being on devices rather than applications. SDN abstracts the control from individual devices, giving administrators end-to-end visibility of network flows and the power to optimize traffic paths via policy rather than hardware. This freedom to programmatically control the way data flows through a network eases manageability, supports automation and helps administrators more quickly deliver the customized services that are critical to today’s dynamic environments.
An analogy for SDN could be a string puppet, or marionette. An older network would operate each string separately — one string for an arm, another for a leg, etc. SDN acts like the handle on top of the puppet, coordinating all of the strings at the same time. In SDN circles there is also the expression “taking the brains off the box,” meaning centralizing management rather than managing separate devices individually.
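As a sketch of the “handle on top of the puppet” idea, the snippet below pushes one network-wide policy to a controller’s northbound REST API instead of configuring each switch. The controller URL and policy schema are hypothetical; real controllers such as OpenDaylight or ONOS expose comparable but different interfaces.

```python
import requests

# Hypothetical SDN controller endpoint, for illustration only.
CONTROLLER = "https://sdn-controller.agency.example/api"

def prioritize_application(app: str, dscp: int) -> None:
    """Push one network-wide policy instead of touching every device."""
    policy = {
        "match": {"application": app},
        "action": {"set_dscp": dscp, "path": "lowest_latency"},
    }
    resp = requests.put(f"{CONTROLLER}/policies/{app}", json=policy, timeout=15)
    resp.raise_for_status()

# One call reprograms forwarding behavior on every device the controller
# manages -- the handle moving all the strings at once.
prioritize_application("voip", dscp=46)
```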
The current federal acquisition process is poorly suited for SDN adoption. In addition to being too slow, the process is too derivative, attempting to reuse the same requirements and methodologies as past procurements. Adopting SDN means starting with a clean slate and asking “what is needed” instead of just adding another line item for SDN.
Where SDN has been successful in Federal IT, the work has been done by small innovation teams supported by leadership. These teams have the skills, the passion and, just as important, the mandate for change. With SDN the best approach is to think big, start small and move fast. Often the best way to start is by automating provisioning, since that is a stable, well-understood process that is contained yet representative of the larger network (see the sketch below).
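For example, a provisioning workflow might look like the following sketch, which uses the open-source netmiko library to push a templated VLAN configuration to a switch. The device details are placeholders, and the `device_type` is an assumption that would need to match your actual platform.

```python
from netmiko import ConnectHandler  # pip install netmiko

# Placeholder device details; device_type must match your platform
# (netmiko supports many, including several Brocade types).
switch = {
    "device_type": "brocade_vdx",   # assumption: adjust for your hardware
    "host": "10.0.0.10",
    "username": "admin",
    "password": "****",
}

# The template lives in version control, so every deployment is the same
# reviewed configuration rather than ad hoc CLI keystrokes.
VLAN_TEMPLATE = ["vlan {vid}", "name tenant-{vid}"]

def provision_vlan(vid: int) -> str:
    """Render the template and push it to the device in one session."""
    commands = [line.format(vid=vid) for line in VLAN_TEMPLATE]
    with ConnectHandler(**switch) as conn:
        return conn.send_config_set(commands)

print(provision_vlan(42))
```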
SDN also requires embracing the primacy of open source for next-generation networking. According to the Linux Foundation, 90 percent of future networking advances will involve open source. Federal agencies need contracting partners with demonstrated leadership in the open source community. And beyond that, the federal end user must be comfortable working with open source, since it will be providing critical functionality in the network.
Federal IT leaders must also articulate clear success metrics for SDN. It can’t simply be treated as a cool, new science project. Be clear at the outset what you are trying to accomplish. A new network capability can be linked to greater network flexibility. Cutting time to deploy services increases agility, and a lower ratio of operators to devices can be a metric for greater efficiency.
Too often in IT the assumption is if something is new, then it has to be difficult. That is totally false when it comes to SDN. By simplifying the network, performance can be enhanced and new capabilities can be more easily developed.
In our daily lives, most of us no longer need to carry a day planner, a GPS device or a camera. A smartphone now does all of that for us, plus the flexibility to choose new applications as needed. That’s the kind of transformation SDN can power for Federal IT networks. Now is the time to start to implement software-defined networking to ensure your IT department is truly enabling your department’s mission.

DevOps
DevOps — The Right Tools Meet the Right Culture

There is no topic more buzz-worthy right now in IT circles than DevOps. In fact, a survey conducted early this year by MeriTalk found that 78 percent of Federal IT professionals think that DevOps will help their agency innovate and develop new services faster.
So what is DevOps, and why do so many people feel it has such potential to transform IT? DevOps is a culture of trust and collaboration in which people use the right tools for automation to achieve continuous delivery. A simple working definition would be “infrastructure as code”.
Here’s an example of why both tools and culture are prerequisites for DevOps. If you have cultivated a healthy, innovation-supporting culture but are still using a command-line interface (CLI) to manage your network, that is not DevOps, and you will fail. Conversely, if you have the most advanced tools but a culture where, every time there is a problem, the instinct is to back out the change rather than identify the problem and quickly fix it, then that is not DevOps either, and you will fail.
DevOps is infrastructure as code, and automation is about executing workflows automatically. Both are important, but they are different, and you need to be clear on what your objectives are. Automation can precede the adoption of full DevOps; it is not an all-or-nothing proposition. Remember that you are not automating the network, you are automating tasks performed on the network. So clearly identifying your workflows is the necessary first step.
Typical examples of workflows include troubleshooting and provisioning. Note that writing a script is not automation. Scripts only solve a keystroke problem; they can’t coordinate resources or exchange information. For automation to occur, three things need to be true (a minimal sketch follows this list):
- Elements need to be able to talk to each other, which requires a data distribution solution
- Those elements must be able to understand each other, which requires a data normalization layer
- Action must be defined as a series of “if this, then that” reusable logic blocks
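Here is a minimal sketch of the third requirement: “if this, then that” blocks expressed as reusable condition/action pairs in Python. The event fields, device names and actions are illustrative only; in a real system the events would arrive already normalized by the layers the list describes.

```python
from dataclasses import dataclass
from typing import Callable

# One "if this, then that" logic block: a condition paired with an action.
@dataclass
class Rule:
    condition: Callable[[dict], bool]
    action: Callable[[dict], None]

# Reusable rules; real actions would call orchestration APIs, not print.
rules = [
    Rule(condition=lambda e: e["type"] == "link_down",
         action=lambda e: print(f"rerouting around {e['device']}")),
    Rule(condition=lambda e: e.get("cpu", 0) > 90,
         action=lambda e: print(f"draining traffic from {e['device']}")),
]

def handle(event: dict) -> None:
    """Evaluate each block against a normalized event, in order."""
    for rule in rules:
        if rule.condition(event):
            rule.action(event)

handle({"type": "link_down", "device": "spine-2"})
handle({"type": "telemetry", "device": "leaf-7", "cpu": 95})
```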
Automation can extend far beyond routers and switches. Anything attached to a sensor can be automated — the source of the data becomes the catalyst.
Automation alone does not create a DevOps environment. To get started on DevOps, you need to:
- Determine whether your agency has the right culture for DevOps
- Fight the temptation to start with tools — people first
- Consider using your favorite tools to identify workflows
- Make sure to involve more than the networking team. Bring in all the silos. DevOps is a team sport!
With the right culture and tools in place, DevOps can transform Federal IT.
To learn more, visit Brocade.com

