Q&A: What's the next development for enterprise architecture?

By Tim Sandle, Dec 31, 2019 in Business
One of the big business developments for 2020 is likely to be enterprise architecture and related issues, including cloud migration and M&A IT integrations. Expert Ned Gnichtel provides analysis.
Enterprise architecture refers to the processes required to proactively and holistically lead enterprise responses to disruptive forces, by identifying and analyzing the execution of change toward the desired business vision and outcomes.
The themes of IT design, enterprise integration and enterprise ecosystem adaptation are each big areas of business focus for 2020. To learn more, Digital Journal spoke with Crosscode's Director of Technical Strategy, Ned Gnichtel.
Digital Journal: How important is digital transformation for business?
Ned Gnichtel: The nature of technology in business today is that it is the business, even for companies that think of themselves as doing other things. In most cases, their ability to function is almost entirely dependent on technology.
Among non-technology companies, financial services and banking are the largest consumers of technology in the private sector. From ATMs to transaction clearing systems, financial services organizations simply can't function without the numerous, heterogeneous and, often, multi-generational systems they've built.
However, this situation extends far beyond financial services. The legal profession relies on systems for automatic, increasingly machine learning-directed, discovery tools for litigators; document management systems; and legal query systems for looking up case law. Years ago, when you went to a law office, they often had a library full of books, and a large part of a paralegal's day was spent trying to find specific snippets of case law. For two decades, that's been largely accomplished through LexisNexis and other services.
Even in manufacturing, technology is intrinsically integrated into every facet of the business. Engineers and designers build entire products in CAD/CAM systems; computer-driven finite element analysis allows engineers to understand how materials and structures will perform before the first physical prototype is built. In the late 1970s, a North American automobile assembly plant employed roughly 15,000 hourly and salaried workers. In 2019, that same plant, with the same amount of output, employs about 2,000 hourly and salaried workers. It is massively automated. There are no welders on a modern automotive assembly line in America today; it's entirely automated. It doesn't stop there, either: the drive is towards "lights-out" factories, where virtually every aspect of assembly is automated.
Across the board, the footprint from a technology standpoint is not just important; it is deeply embedded inside everything we do. Digital transformation is something that already happened. What we're really talking about now is digital modernization: taking systems that have become ossified, difficult to support and have fallen behind the support curve, and bringing them forward. Architecturally, from an application standpoint, they might be monolithic -- or in many cases, they're not monolithic; they're just a mess of different generations of code and frameworks. The goal is to get systems to a more modern, more dynamic, more scalable, more extensible and more serviceable state.
So how important is digital transformation, if we are defining it as "modernization"? Absolutely critical, because the transformation has already happened. Now the question is: how do you keep it working, make it work better, and address the risks of how deeply dependent every aspect of our modern lives is on technology?
DJ: Is cloud computing a business necessity?
Gnichtel: Cloud, as a marketing term, is used to describe a whole host of services, patterns, models, etc. If discussing public cloud, such as IaaS or PaaS offerings, it's not a necessity to use such services, though they may have significant advantages, especially in enabling more dynamic scenarios. However, the patterns that modern computing dictates, meaning highly scalable, dynamic and extensible architectures, are absolutely a business necessity. Put simply, whether maintaining an on-premises private cloud, building a hybrid solution, or using public cloud, the patterns and architectures should be largely the same.
The above statements need to be qualified against the backdrop of defining cloud properly. A lot of what is routinely labeled as cloud is not cloud; it's just hosted services. To put it bluntly, Infrastructure-as-a-Service is just a hosting model using various virtualization technologies at scale. Most public IaaS implementations feature nice virtual networking tools to build "virtual private clouds" for hosting applications in virtual machines. However, this is not substantially different than a well-implemented on-premises virtualization solution. Simply taking a legacy application running in a VM on premises and moving it to a VM running at a cloud provider doesn't constitute a cloud migration; it's merely outsourcing the hosting of the VM.
When we talk about true cloud from an application architecture standpoint, it's this notion that the code you're writing is patterned in such a way that it essentially doesn't care about the underlying infrastructure topology. It can be multi-instance, deployed on-demand in an idempotent fashion, can be scaled up and down and can run on any combination of deployment modalities from containerized to bare-metal. From an application design viewpoint, it's a set of patterns for non-monolithic, microservicized architecture that use very well-defined interfaces and standard protocols. Applications built using these patterns should be easily serviced, deployed, maintained, monitored and scaled. Lastly, the pattern assumes a comprehensive DevOps pipeline, from application development through to deployed and monitored state.
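By way of illustration (not from the interview), a minimal sketch of such a stateless, infrastructure-agnostic service might look like the following; the framework choice (Flask), endpoint names and environment variables are assumptions made for the example.

```python
# Illustrative sketch (not from the interview) of a stateless, infrastructure-agnostic
# service. All environment-specific detail comes from configuration, so the same
# artifact can run in a container, a VM or on bare metal, and many identical
# instances can run side by side because no state is held in the process.
import os
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical downstream dependency, injected via environment rather than hard-coded.
ACCOUNTS_API = os.environ.get("ACCOUNTS_API", "http://accounts:8080")

@app.route("/healthz")
def health():
    # Liveness probe: lets an orchestrator or load balancer decide when to route
    # traffic to this instance or replace it.
    return jsonify(status="ok")

@app.route("/quotes/<symbol>")
def quote(symbol):
    # A well-defined, versionable interface; the caller never needs to know which
    # instance, host or datacenter answered.
    return jsonify(symbol=symbol.upper(), source=ACCOUNTS_API)

if __name__ == "__main__":
    # The port comes from the environment so the deployment pipeline, not the code,
    # decides how the service is exposed.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", "8000")))
```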
True cloud is a world where "I write some code using well defined interfaces, I push it to wherever I'm pushing it, and it just works”. The implications of such a world, from a business standpoint, are massive.
DJ: What types of big data analytics are made possible with cloud solutions?
Gnichtel: That really depends on the architecture of the application. When we talk about mining data and analyzing it, in theory if you have architected your applications correctly, you should be able to apply data analytics without much hassle. The problem is that if we're looking at legacy systems, there's a big problem in terms of trying to normalize data. Just getting access to such systems, because they're often siloed around security barriers that exist between different organizational environments, is a huge issue when attempting to effectively use latent data.
For example, if you're in the marketing side of a bank and you're trying to do some data analytics around what your customers like to do from a checking account standpoint, and you think you're going to get easy access to the systems that have that information, cloud isn't going to do anything for you because you're not going to be able to get to them for a myriad of internal security reasons! To solve this problem, organizations must implement new data architectures that allow for moderated access control to enable data mining in ways that are safe. This is no easy task as it often requires completely rethinking the data architecture of such systems; there's a whole area of specialization that has evolved around data architects and their skillsets.
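As a rough illustration of what moderated access control can mean in practice, the sketch below (with hypothetical roles and field names, not taken from the interview) shows a data-access layer that returns only the fields a caller's role is entitled to see, so a marketing analyst can mine behavioral data without ever touching raw account identifiers.

```python
# Illustrative sketch of a moderated data-access layer (hypothetical roles and fields).
# Marketing analytics sees behavioral attributes; account identifiers are withheld.
from dataclasses import dataclass
from typing import Dict, List

# Which fields each role is allowed to see (assumption made for the example).
ROLE_VISIBLE_FIELDS = {
    "marketing_analyst": {"segment", "avg_monthly_balance_band", "product_mix"},
    "fraud_investigator": {"account_id", "segment", "product_mix", "last_login_ip"},
}

@dataclass
class CustomerRecord:
    account_id: str
    segment: str
    avg_monthly_balance_band: str
    product_mix: List[str]
    last_login_ip: str

def fetch_for_role(records: List[CustomerRecord], role: str) -> List[Dict]:
    """Return only the fields the role is entitled to; everything else is dropped."""
    allowed = ROLE_VISIBLE_FIELDS.get(role, set())
    return [{k: v for k, v in rec.__dict__.items() if k in allowed} for rec in records]

# Example: the analyst sees aggregable behavioral data, never the raw account ID.
sample = [CustomerRecord("AC-1001", "mass-affluent", "10k-25k", ["checking", "card"], "10.0.0.7")]
print(fetch_for_role(sample, "marketing_analyst"))
```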
There's a great commercial, from one of the big cloud providers, where a scientist talks about how the cloud allowed them to perform massive and complex data analysis, and how this provider's offerings essentially let them build a giant, on-demand, loosely-coupled supercomputer. It's a great story, but the data itself had to be structured and architected in a way that allowed them to do that; the first battle is centered around getting the data architecture correct.
Outside of the ability to rapidly provision tiered storage and compute engines for analytics, cloud doesn't really enable anything in this space, beyond an impetus to migrate to new patterns, thus potentially addressing the data architecture concerns.
Some of this sounds cynical, because basically the underlying theme is that the cloud is a lot of hooey! The reality is that the established technological evolutionary path is towards the patterns of highly scalable, highly distributed systems, to which the "cloud" is driving us. Those patterns drive certain architectural design decisions, both in terms of infrastructural design and software architectural design. Put simply, the cloud is a bunch of computers. It's a bunch of computers with abstraction layers and some patterns that allow you to distribute your application and data across many instances. So the real battles we're dealing with are around how to do massively parallel distributed computing better. Anyone who's worked in the area of supercomputers, or has tried to deal with horizontally scaling workloads, understands these problems well.
We're in a world where, if you want your application to scale better, at every level we're dealing with massive levels of parallelism. Your personal computer today is not really a single CPU system; it has anywhere from 4 to 16 cores (or even more, depending on how powerful a machine you have). A basic x86 CPU has four on-chip processor cores. With symmetric multithreading, you have two state machines for each one, so it's eight logical processors from the operating system's point of view.
It's a huge problem getting heavily state-dependent workloads efficiently distributed across all those processors. Go all the way up the stack and you're trying to build highly scalable applications that have to scale to millions of users, and do so dynamically, so that one is only paying for what's used, which is very important if you're in the public cloud. The sheer complexity, top-to-bottom, of these issues is pretty mind-boggling. These are hard problems to solve; from thread optimizations for individual instances of a service, to architecting the data model to support hundreds (or even thousands) of instances, this is where understanding and adopting modern patterns is very important.
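As a small illustration of the parallelism being described, the Python sketch below (standard library only, with arbitrary example values) reports how many logical processors the operating system sees (a four-core chip with symmetric multithreading typically reports eight) and spreads a stateless, CPU-bound task across them; the genuinely hard problem the interview points to is doing the same for heavily state-dependent workloads.

```python
# Sketch: how many logical processors the OS sees, and distributing a stateless,
# CPU-bound workload across them. On a 4-core x86 chip with SMT enabled,
# os.cpu_count() typically reports 8 logical processors.
import os
from concurrent.futures import ProcessPoolExecutor

def cpu_bound_task(n: int) -> int:
    # Stand-in for real work: stateless, so it parallelizes cleanly.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    logical = os.cpu_count() or 1
    print(f"Logical processors visible to the OS: {logical}")

    # Stateless tasks spread easily; heavily state-dependent workloads are the
    # hard part and do not scale out this simply.
    with ProcessPoolExecutor(max_workers=logical) as pool:
        results = list(pool.map(cpu_bound_task, [200_000] * logical))
    print(f"Completed {len(results)} tasks in parallel")
```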
DJ: What are the security risks with cloud technology?
Gnichtel: The risks are severe, because "cloud" technology allows you to do things very quickly, especially if you're building a new application from the ground up. There are a lot of excellent tools and frameworks out there that allow you to build an application very quickly but, outside of some automated checks that are run by the developer tools, without much of the architectural review practice needed to validate decisions. We've seen this problem manifest itself in very serious security model shortcomings and, in some cases, outright deployment misconfigurations. We saw this with one large bank, where they had a misconfiguration in their Amazon VPC and, as a result, they had a huge publicly disclosed security lapse.
In the older siloed world of sitting behind a firewall inside a data center, where there were internal controls around how things were getting done, you could make the very same mistakes but, generally, the repercussions weren't as severe. It's incredibly easy to stand up a couple of things in AWS or Azure and have them be wide open to the world. We see all kinds of problems with things like certain database technologies not being secure out of the box, and inexperienced developers configuring applications to do things that are not safe in terms of handling data. Because it's so easy to get these things up and running, and because the frameworks, languages and developer environments have pretty good tools for checking certain basics, there's a whole other area of hygiene that is being ignored. For many large organizations, internal IT security controls haven't been effectively extended to the cloud. Meanwhile, management is making broad assumptions that cloud security depends more on the skills and mechanisms of the provider than on conventional security best practices.
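One example of the basic hygiene that gets skipped is checking that nothing is "wide open to the world". The sketch below, which assumes boto3 is installed and AWS credentials are configured, flags EC2 security groups that allow inbound traffic from any address; it is a minimal illustration of the idea, not a substitute for an organization's security controls.

```python
# Hedged sketch: flag AWS security groups with inbound rules open to 0.0.0.0/0.
# Assumes boto3 is installed and AWS credentials/region are configured.
import boto3

def find_world_open_security_groups(region: str = "us-east-1"):
    ec2 = boto3.client("ec2", region_name=region)
    flagged = []
    for group in ec2.describe_security_groups()["SecurityGroups"]:
        for rule in group.get("IpPermissions", []):
            for ip_range in rule.get("IpRanges", []):
                if ip_range.get("CidrIp") == "0.0.0.0/0":
                    # Rules covering all protocols may have no FromPort.
                    flagged.append((group["GroupId"], group["GroupName"],
                                    rule.get("FromPort", "all")))
    return flagged

if __name__ == "__main__":
    for group_id, name, port in find_world_open_security_groups():
        print(f"{group_id} ({name}) is open to the world on port {port}")
```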
Also, I want to be clear, underlying cloud provider technologies do not lack security or security features. In fact, if anything, they have such a plethora of security options that it confuses people and subsequently leads to misconfigurations. But if done correctly, the underlying platforms themselves are decent. The real problem comes down to the fact that, when it's very easy to get something done rapidly, especially outside the purview of traditional IT security, you potentially run the risk of creating security problems.
DJ: What are the data privacy considerations?
Gnichtel: Again, you're dealing with this ability to build things quickly; it was probably a little bit slower in some of the older patterns, both from the application architecture standpoint and because it took more time to even get something stood up in a legacy on-premises environment. But barring those security issues that come from the fact that things can get stood up very quickly, the considerations are the same whether we're on-premises or in the cloud, regardless of what the patterns are. It's imperative that organizations and architects understand what systems are accessing what sets of data. For many, if not most, I can tell you that this is distressingly misunderstood and badly documented.
Often, you will have systems where, two dimensions removed, one system is calling to another system, to get a piece of data that is potentially sensitive customer information, and the person who's in the middle doesn't even realize their system is being used in that fashion! Theoretically you'd hope that this wouldn't be happening, but in many environments, it is, and the lack of understanding around the real nature of application relationships and dependencies is one of the most serious problems facing architects and security professionals.
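That "two dimensions removed" problem can be made concrete with a small sketch. Given a hypothetical map of which service calls which (the names below are invented for illustration), a simple traversal reveals every service with a direct or transitive path to a system holding sensitive data, including the ones in the middle that never realize they are part of that path.

```python
# Sketch of transitive data exposure: given a (hypothetical) service call graph,
# find every service that can directly or indirectly reach a sensitive system.
from collections import deque
from typing import Dict, List, Set

# Hypothetical call graph: caller -> services it calls.
CALL_GRAPH: Dict[str, List[str]] = {
    "marketing-portal": ["campaign-service"],
    "campaign-service": ["customer-profile-service"],
    "customer-profile-service": ["core-banking-db"],  # holds sensitive customer data
    "reporting-batch": ["customer-profile-service"],
}

SENSITIVE_SYSTEMS = {"core-banking-db"}

def services_reaching_sensitive_data(graph: Dict[str, List[str]],
                                     sensitive: Set[str]) -> Set[str]:
    """Return every service with a direct or transitive path to a sensitive system."""
    reaches = set()
    for start in graph:
        queue, seen = deque([start]), {start}
        while queue:
            node = queue.popleft()
            if node in sensitive:
                reaches.add(start)
                break
            for callee in graph.get(node, []):
                if callee not in seen:
                    seen.add(callee)
                    queue.append(callee)
    return reaches

print(services_reaching_sensitive_data(CALL_GRAPH, SENSITIVE_SYSTEMS))
# Every service listed here touches sensitive data, even if only indirectly.
```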
Frankly, I don’t think there is a set of specific considerations for the cloud world, but what I will say is that this is a problem across the board, whether we’re talking about legacy on-prem systems, private cloud, hybrid-cloud or public-cloud.
DJ: Is there a role for edge computing? How is this being applied?
Gnichtel: Yes, I think there's a big case for it. You run code where it makes sense to run it, especially when trying to address latency and user experience. When talking about IoT and shortening paths for the purpose of speeding transactions, edge is a rather established pattern for many workloads. With the advent of increasingly microservice-oriented architecture, the line between "back-end" cloud and edge is likely to become more blurred, since you may have reusable service instances running at various positions in the network, but the basic principles are the same.
Additionally, I think we'll see greater integration between client, edge and back-end services. We already see this with client-side applications using frameworks like Electron, where portions of the MEAN stack, edge, and even back-end services are implemented in rich-client apps to enable low-bandwidth and offline scenarios. In this scenario, the nature of the edge is somewhat nebulous.
To read the second part of our interview with Gnichtel, looking at predictions for 2020, see: "Q&A: 2020 will be about modernization of legacy applications."