Labor arbitrage and shared services companies have enjoyed a perfect marriage over the last 20 years. Then along came the Digital Revolution with new business models and a new construct for services. One component of that digital model is DevOps. It has a significant impact on business services, but it’s important to understand how it changes the picture for labor arbitrage and shared services.
Shared service companies are structured on a functional basis. One way to think about them is as a stack of functional expertise. In the case of IT, the stack includes functions such as infrastructure, security, application development and maintenance, and compliance. There is a multi-stack hierarchy, with each functional layer having shared service champions responsible for delivering that function cost-effectively at a high level of quality. Labor arbitrage fits perfectly into this equation: each functional layer uses people, and the work can often be done more cost-effectively offshore than onshore.
This year Rackspace won the Red Hat Innovation Award in two categories: Cloud Infrastructure and 2017 Red Hat Innovator of the Year. What sets Rackspace apart from its competitors is that it co-founded OpenStack, along with NASA.
Being there from day one also means Rackspace has seen both sunny and rainy days.
Speaking of sunny days: at the OpenStack Summit in Boston, Jonathan Bryce, the executive director of the OpenStack Foundation, talked about the second generation of cloud, remotely managed private cloud, which Rackspace has already been delivering for a while now.
“That's where we've seen growth in our business,” said Thompson. “You can see other vendors in this space trying to provide this as a managed service or as a service type of an offering. Because that's the way I think people are looking to consume complex open source technologies.”
The European Organisation for Nuclear Research (CERN) is upgrading its wireless broadband network in order to support thousands of researchers using mobile devices while moving around its campus buildings.
There are more than 12,000 staff, visiting researchers and contract workers onsite at CERN's physics laboratory in Geneva each day, supporting projects such as the Large Hadron Collider. Around 20,000 mobile devices are used, requiring reliable wifi connectivity.
Having experienced problems with its independent wireless access points in the past, CERN decided to upgrade its network in 2015. "[The aim is to] enable seamless roaming in buildings across the campus and to get people in offices to give up their wired connections and be happy with wifi," says Dr. Tony Cass, who leads the Communications Systems Group in CERN's Information Technology Department.
A gamble on a relatively unknown technology four years ago is paying off for a logistics company, which is using the software to shave millions of dollars off its bandwidth connectivity costs. Today freight forwarding company JAS Global is leveraging a software-defined wide area network (SD-WAN) to run cloud applications, according to JAS CIO Mark Baker. Eventually, Baker hopes to use the SD-WAN as the backbone of a predictive analytics strategy to grow the business.
SD-WANs allow companies to set up and manage networking functionality, including VPNs, WAN optimization, VoIP and firewalls, using software to program traffic routing typically conducted by routers and switches. Just as virtualization software disrupted the server market, SD-WANs are shaking the networking equipment market.
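The core idea, routing decisions made in software per application rather than fixed in router configuration, can be illustrated with a minimal sketch. The link names, latency and cost figures, and policies below are all hypothetical, not taken from any vendor's product:

```python
# Hypothetical WAN links available at a branch site, with rough per-link metrics.
LINKS = {
    "mpls":      {"latency_ms": 20, "cost_per_gb": 5.00, "up": True},
    "broadband": {"latency_ms": 45, "cost_per_gb": 0.10, "up": True},
    "lte":       {"latency_ms": 60, "cost_per_gb": 2.00, "up": True},
}

# Per-application policies: latency-sensitive traffic gets a tight bound,
# bulk traffic tolerates anything and should just be cheap.
POLICIES = {
    "voip":   {"max_latency_ms": 30},
    "backup": {"max_latency_ms": 500},
}

def pick_link(app):
    """Choose the cheapest link that is up and meets the app's latency policy."""
    policy = POLICIES[app]
    candidates = [
        (name, link) for name, link in LINKS.items()
        if link["up"] and link["latency_ms"] <= policy["max_latency_ms"]
    ]
    name, _ = min(candidates, key=lambda item: item[1]["cost_per_gb"])
    return name

print(pick_link("voip"))    # only mpls satisfies the 30 ms bound
print(pick_link("backup"))  # broadband: cheapest qualifying link
```

A real SD-WAN controller layers failover, encryption, and continuous path measurement on top of this, but the policy-then-select loop is the piece that displaces static routing.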
An effective DNS (Domain Name System) infrastructure is a critical component of system uptime, which is essential to the viability and continuity of web services. For complex websites, a third of page load time can be attributed to DNS lookups. Inadequate or improperly configured DNS can have a potentially catastrophic impact on a company’s online presence.
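As a rough illustration of that lookup cost, here is a minimal Python sketch that times name resolution for each host a page depends on; the hostnames are placeholders, not from the article:

```python
import socket
import time

def time_dns_lookup(host, port=443):
    """Return name-resolution time in milliseconds, or None if the lookup fails."""
    start = time.monotonic()
    try:
        socket.getaddrinfo(host, port)
    except socket.gaierror:
        return None
    return (time.monotonic() - start) * 1000.0

# A complex page pulls assets from many hosts and pays a lookup cost for each.
for host in ("localhost", "no-such-host.invalid"):
    print(host, time_dns_lookup(host))
```

Summing these times across every hostname a page references gives a quick sense of how much of its load time is pure DNS, and why slow or misconfigured resolvers hurt so much.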
Greg Downer, senior IT director at Oshkosh Corp., a manufacturer of specialty heavy vehicles in Oshkosh, Wisc., wishes he could tip the balance of on-premises vs. cloud more in the direction of the cloud, which currently accounts for only about 20% of his application footprint. However, as a contractor for the Department of Defense, his company is beholden to strict data requirements, including where data is stored.
"Cloud offerings have helped us deploy faster and reduce our data center infrastructure, but the main reason we don't do more in the cloud is because of strict DoD contract requirements for specific types of data," he says.
In Computerworld's Tech Forecast 2017 survey of 196 IT managers and leaders, 79% of respondents said they have a cloud project underway or planned, and 58% of those using some type of cloud-based system gave their efforts an A or B in terms of delivering business value.
Net neutrality is like a public park that anyone can use. ‘Pay-To-Play’ is a private club that only rich members use.
What happens to the internet when access isn't equal? (Or to paraphrase George Orwell in Animal Farm, "We're all equal, but some are more equal than others").
How could this impact consumers, businesses and non-profits?
“Ajit Pai, the chairman of the Federal Communications Commission outlined a sweeping plan to loosen the government’s oversight of high-speed internet providers, a rebuke of a landmark policy approved two years ago to ensure that all online content is treated the same by the companies that deliver broadband service to Americans,” reports the NYT.
Google and Microsoft are both benefiting from solid growth in enterprise cloud services. Both companies released earnings reports this week that highlighted significant momentum in G Suite and Office 365, respectively, as well as in more sophisticated cloud platforms for business.
“We crossed a major milestone with more than 100 million monthly active users of Office 365 commercial,” Microsoft CEO Satya Nadella said during a conference call. “Office 365 commercial seats grew 35 percent year-over-year and revenue is up 45 percent.” Microsoft also reported a 15 percent year-over-year gain in revenue from Office 365 and other consumer products. The company ended the quarter with 26.2 million Office 365 consumer subscribers.
Enterprises that try to slow down Microsoft's upgrade train by skipping one of the twice-yearly Windows 10 refreshes will have to hustle to stay in support, according to the company's latest scheduling disclosures.
Corporate users of Windows 10 may have as little as two months to deploy a feature upgrade after passing on the one prior. Only if IT administrators are willing to roll out a consumer-quality version -- one that Microsoft has not yet given the approved-for-business green light -- will they have up to six months to upgrade employees' PCs.
Those limitations come from Microsoft's latest pledge to support any given Windows 10 feature upgrade for 18 months, and the company's long-standing timeline on how it moves each upgrade from development to release, first to consumers and then to commercial customers.
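The arithmetic behind those two- and six-month windows can be sketched directly. The 18-month support window and twice-yearly cadence are from Microsoft's disclosures as reported above; the four-month lag before a release is declared business-ready is an assumed figure chosen to reproduce the article's numbers:

```python
SUPPORT_MONTHS = 18      # Microsoft's stated support window per feature upgrade
RELEASE_CADENCE = 6      # twice-yearly feature upgrades
BUSINESS_READY_LAG = 4   # assumed months until a release gets the business green light

def months_left_after_skipping(skipped=1, wait_for_business_ready=True):
    """Months remaining to deploy the next upgrade after skipping `skipped` releases,
    counted from the release you are currently running."""
    next_release_available = RELEASE_CADENCE * (skipped + 1)
    if wait_for_business_ready:
        next_release_available += BUSINESS_READY_LAG
    return SUPPORT_MONTHS - next_release_available

print(months_left_after_skipping())                               # wait for validation: 2
print(months_left_after_skipping(wait_for_business_ready=False))  # consumer build: 6
```

Skipping one release and waiting for the business-ready designation leaves roughly two months of your current release's support window; deploying the consumer-quality build as soon as it ships stretches that to about six.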
Even the good news is bad news.
While Joshua Corman didn’t use that exact line in his opening keynote at SOURCE Boston this week, that was a pervasive, and sobering, theme.
Corman, a founder of I am The Cavalry and director of the Cyber Statecraft Initiative for the Atlantic Council, said he was there to tell some “uncomfortable truths” about the state of cybersecurity – among them that, “the critical infrastructure of our space is too big to fail, and it’s failing.”
He said the current statistics are depressing enough – that the database of CVEs (Common Vulnerabilities and Exposures), “which is the predicate for all of our intrusion detection,” holds only about 80 percent of those in existence, and that there is security “coverage” – blocking or detection technology – for only 60 percent of that number. “So you’re at 60 percent of 80 percent,” he said. “At best, you’re getting about 50 percent coverage of the knowns. When you make a risk decision, you’re doing it with a 50 percent blind spot.”
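Corman's "60 percent of 80 percent" works out as straight multiplication; the two percentages below are the figures from his talk:

```python
# ~80% of vulnerabilities in existence are catalogued as CVEs,
# and blocking/detection technology covers ~60% of those catalogued entries.
known_share = 0.80
covered_share = 0.60

effective_coverage = known_share * covered_share
print(f"Effective coverage of the knowns: {effective_coverage:.0%}")
# 48%, roughly the "about 50 percent" Corman cites
```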
The cloud has been in the news a lot lately, and mostly for bad behavior. It’s been slow, expensive, insecure or simply MIA — taking major corporations offline for hours and raising questions about the future of cloud computing.
Is cloud computing going away? Absolutely not, but a rapidly emerging new technology may mean that we won’t be stuck with our cloudy blues for long.
Imagine enabling organizations to leverage the benefits of both cloud and on-site IT, with the speed, resiliency, bandwidth and scalability to run existing workloads — regardless of location — and to power new technologies such as the Internet of Things (IoT) and machine learning.
Enterprises are adopting software-defined WAN to simplify branch office connectivity, improve application performance, and better manage WAN expenses, according to Gartner, which predicts that spending on SD-WAN products will rise from $129 million in 2016 to $1.24 billion in 2020.
“While WAN architectures and technologies tend to evolve at a very slow pace — perhaps a new generation every 10 to 15 years — the disruptions caused by the transformation to digital business models are driving adoption of SD-WAN at a pace that is unheard of in wide-area networking,” Gartner writes.
Two early adopters of SD-WAN shared some of the gains they’re realizing from the technology. The Bay Club Company and Autodesk are deploying SD-WAN technology from VeloCloud and CloudGenix, respectively, to transform the way they provision and support remote sites.
In December 2016, Steve Randich took the stage at Amazon Web Services’ (AWS) re:Invent conference to tell the story of how his organization made a daring move to the public cloud.
He ticked through a series of benefits to the Financial Industry Regulatory Authority (FINRA), the privately held, independent regulator of financial markets. One of them — the performance gain of 400 times — must have been mind-boggling to many (including the headline writer for the official YouTube video, which still states the gain as 400 percent).
All of the results were impressive, particularly because this wasn’t some low-value deployment. Instead, FINRA put its mission-critical market surveillance platforms on AWS, along with 90 percent of its data. At a time when many companies were flirting with the public cloud, Randich and his team were going all-in.
For such a seemingly obvious idea, Gartner ignited quite a firestorm with its proposition that, to remain relevant, IT must be broken into two distinct realms: one focused on keeping the lights on, or, in Gartner parlance, Mode 1, and one devoted to the cool stuff that business people want, or Mode 2.
Can an organization really cut development time more than 70 percent by embracing the agile philosophy and open architecture? The intelligence-gathering arm of the U.S. Air Force says it's done just that.
The Air Force's Distributed Common Ground System, a network of 27 surveillance and intelligence-gathering sites, projects that it will ultimately save hundreds of millions of dollars by moving to agile development, open architecture, and infrastructure-as-a-service, said Wes Haga, chief of mission applications and infrastructure programs at the Air Force Research Lab.
Industrial control systems (ICS) that run the valves and switches in factories may suffer from inherent weaknesses that cropped up only after they were installed and the networks they were attached to became more widely connected.
The problems range from hard-coded passwords that are publicly available to vulnerabilities in Windows operating systems that are no longer supported but are necessary to run the aging gear, says Sean McBride, attack-synthesis lead analyst at FireEye iSIGHT Intelligence and author of “What About the Plant Floor? Six subversive concerns for industrial environments.”
In the 18 months since the company split from its sister consumer business, Hewlett Packard Enterprise has been in an almost constant state of refining its strategy.
The company backed out of the public cloud market; sold off its Enterprise Services business to competitor CSC for $8.5 billion; dealt other “non-core” assets to Micro Focus in an $8.8 billion deal; and handed its OpenStack and Cloud Foundry development efforts off to Suse. HPE also bought all-flash storage vendor Nimble Storage for $1 billion last year and snapped up hyperconverged infrastructure vendor Simplivity for another $650 million in January.
New and innovative security tools seem to be emerging all the time, but the frontline defense for just about every network in operation today remains the trusty firewall. They aren’t perfect, but if configured correctly and working as intended, firewalls can do a solid job of blocking threats from entering a network, while restricting unauthorized traffic from leaving.
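That two-way job, blocking inbound threats while restricting unauthorized outbound traffic, comes down to ordered rule evaluation with a default deny. The sketch below is a hypothetical first-match evaluator, not any vendor's rule syntax; the ports and directions are illustrative:

```python
# First-match rule list: the first rule matching a packet's direction and port wins.
RULES = [
    {"direction": "in",  "port": 22,  "action": "deny"},   # block inbound SSH probes
    {"direction": "in",  "port": 443, "action": "allow"},  # allow HTTPS to the web tier
    {"direction": "out", "port": 25,  "action": "deny"},   # stop unauthorized mail egress
]
DEFAULT_ACTION = "deny"  # anything no rule matches is dropped

def evaluate(direction, port):
    """Return the action for traffic in the given direction on the given port."""
    for rule in RULES:
        if rule["direction"] == direction and rule["port"] == port:
            return rule["action"]
    return DEFAULT_ACTION

print(evaluate("in", 443))   # allow
print(evaluate("out", 25))   # deny
print(evaluate("in", 8080))  # deny (falls through to the default)
```

The "configured correctly" caveat in the paragraph above is exactly about this rule list: rule order, overly broad allows, and a permissive default are the usual ways a sound firewall fails in practice.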
Potentially difficult times
In tech, divestitures are a fact of life and solutions are bought and sold all the time. But that doesn’t change the fact that when it happens to a solution that your company uses, it can make things difficult for you. Although your vendor’s divestiture is out of your control, you can at least do your due diligence in limiting any negative impact to your company. With that in mind, ZL Technologies lists 10 things you should worry about if your vendor divests.
Organizing public rallies really is rocket science. A successful launch needs careful planning and good software. Especially when highly vocal users are involved!
How well does open source work for such a demanding application, with hundreds of simultaneous events and millions of followers? How is information flow optimized and how are diverse applications integrated? Designing such systems, where a small error could leave thousands stranded, isn’t for the faint of heart.
A team of progressive coders, open standards experts, and veteran developers from Microsoft and Netscape shows how it can be done. Their application integrates open source, cloud-based infrastructure, and content delivery networks to ensure that millions of people get to the right event at the right location, every time.