digital transformation Archives - SD Times
https://sdtimes.com/tag/digital-transformation/

Digital.ai announces newest version of AI-Powered DevOps Platform
https://sdtimes.com/software-development/digital-ai-announces-newest-version-of-ai-powered-devops-platform/ (Wed, 19 Oct 2022)

Today, the digital transformation company Digital.ai announced the Banff release of its AI-Powered DevOps Platform. The release expands intelligence, automation, and collaboration capabilities to help companies accelerate their digital transformation.

According to Digital.ai, with this latest version, public sector and enterprise organizations can speed up delivery while still managing risks and focusing on driving value from software investments. 

The added intelligence capabilities aggregate customer data across the entire software development and delivery lifecycle, automatically building a DevOps data lake that serves as a single source of truth for an organization’s software investments, from planning to production.

“Software strategy now underpins every business strategy and business leaders need to navigate global economic uncertainty with agility, automation and security,” said Stephen Elop, CEO of Digital.ai. “We are truly excited to be delivering to our customers our latest release that removes the mystery and complexities in mapping software investments to topline business goals and priorities. Digital.ai believes that the future of enterprise DevOps is being able to turn data into actionable, predictive insights.”

Additionally, users gain real-time insights, historical trend analysis, and predictability in augmented analytic dashboards. This contextualizes data and allows for organizations to observe trends across the enterprise.

Users also gain access to a Hoylu whiteboard integration, which brings Hoylu’s visual, iterative approach to whiteboard-based collaboration to Agile practices, along with enhancements to the testing recorder, expanded application security, and more.

To learn more, visit the website.

Do you use what you sell?
https://sdtimes.com/softwaredev/do-you-use-what-you-sell/ (Mon, 19 Sep 2022)

Digital transformation can profoundly change not just a company’s technology, but its processes, culture, and people, too. As B2B solution providers, we should have deep insight into how our work affects the customer organizations we serve—and that insight starts in our own operations.

What would you think of a marketing agency that didn’t understand its buyers? Or a consultancy that didn’t live and breathe its own advice? How about a chef who didn’t eat their own food?

Not much, I’m guessing. And in this time of unpredictable change and always-escalating customer demand, improving your business processes in service of becoming an Autonomous Digital Enterprise is part and parcel of ensuring your customers’ future success as well.

“Taking your own medicine”—using the products and services you create—is vital to building trust with customers. This is especially true for companies that help their customers navigate digital transformation. Technology solutions have far-reaching impact on the way employees work and the way services are delivered. Treating your own internal stakeholders as your most important customer can help your business set itself apart from its competitors.

Four Benefits of Using Your Own Solutions 
  1. Identifying tangible business problems: When your own teams identify an area for improvement, you can be confident that a market need likely already exists for that solution.

  2. Improving implementation and integration: When your internal stakeholders are early adopters, they’ll be able to speak to their experience when making recommendations on how to implement and integrate new technology.

  3. Testing with skin in the game: Internal stakeholders need new products to work. Whether they work in customer support, or in other areas, fixing bugs and improving the product helps align their work to the goals of the company.

  4. Recognizing the real benefits: One of the biggest struggles when marketing a new product is getting referrals from customers. Even when customers love the product, they may be reluctant to speak about the vendors they work with for fear of disclosing proprietary information. When companies adopt their own software, they can document the benefits and offer first-hand experience. 

Learning from Experience

At BMC, we learned and applied some of these lessons first-hand. For example, a few years ago, we realized that our data volumes were growing extremely quickly as our customers adopted a new class of cloud-based, SaaS solutions. 

Like our clients, we pulled from a diverse array of data sources—everything from SaaS solutions like Salesforce to applications and databases like Oracle. Getting these sources into our data warehouse for loading, processing, and integration required many different tools and processes. Managing the numerous tools and custom scripting is where we felt the most pain. 

We ran into scalability limits, and the resulting consequences quickly escalated from an IT problem to a business problem. 

Data quality issues put on-time payroll at risk. Without validation, bad data wasn’t always apparent, sometimes causing jobs to fail or forcing entire jobs to be rerun. Our ability to complete the quarterly close on time was threatened.
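
The generic fix for that class of failure is a validation gate: check each batch before it is loaded and quarantine records that fail, so bad data surfaces immediately instead of silently breaking downstream jobs. Here is a minimal sketch in Python; the field names and rules are illustrative, not BMC’s actual checks:

```python
# Minimal pre-load validation gate: reject bad records before they reach
# the warehouse instead of letting them fail (or silently corrupt)
# downstream jobs. Field names and rules are illustrative only.

REQUIRED_FIELDS = {"employee_id", "pay_period", "gross_pay"}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems found in one record (empty means clean)."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    if record.get("gross_pay", 0) < 0:
        problems.append("gross_pay is negative")
    return problems

def validate_batch(batch: list[dict]):
    """Split a batch into loadable records and quarantined ones."""
    clean, quarantined = [], []
    for record in batch:
        problems = validate_record(record)
        if problems:
            quarantined.append((record, problems))
        else:
            clean.append(record)
    return clean, quarantined

if __name__ == "__main__":
    batch = [
        {"employee_id": 1, "pay_period": "2022-Q3", "gross_pay": 5200.0},
        {"employee_id": 2, "pay_period": "2022-Q3", "gross_pay": -10.0},  # bad row
    ]
    clean, quarantined = validate_batch(batch)
    print(f"loading {len(clean)} record(s), quarantined {len(quarantined)}")
```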

The solution came from our own automation and orchestration platform, Control-M. By allowing our teams to automate processes across multiple systems of record, Control-M made our data migration to the cloud possible, providing us with the scalability and elasticity in our data warehouse that modern business demands.  

By solving our own problems with data visibility and integrity, we can turn around and help our customers. Most businesses face the same challenges we did as data and automation needs grow exponentially, and we can now leverage our internal learnings and improvements to find new opportunities to serve them.

Building Credibility

Marketing departments spend millions of dollars to find new ways to understand their customers. They gather intelligence. They conduct focus groups. They assemble panels of customers as advisory councils. These activities are certainly designed to help sell, but they’re also used to help map out what’s next in the product development strategy. 

All of these are valid strategies to solving customer problems more effectively. But how often are you offered an opportunity to walk in the shoes of your customers? When you use your own solutions and services, and when your internal stakeholders understand them thoroughly, you develop the empathy that helps establish credibility with your customers, and you get the insights that empower you to deliver customer-centric value. Together, those two qualities will distinguish you from the competition and ensure a long-lasting customer relationship that continues to offer reciprocal value.

Digital Transformation Success Demands Digital Assurance: A Manifesto
https://sdtimes.com/digx/digital-transformation-success-demands-digital-assurance-a-manifesto/ (Tue, 16 Aug 2022)

Digital transformation continues to change the way organizations operate. With an ever-increasing reliance on data, connectivity, and innovative technologies, digital trust remains at the forefront of the global economy. A secure, trusted digital environment correlates with higher levels of digital maturity, as shown in a recent HBR study. Having confidence in data enables business leaders to make decisions they can truly stand behind. 

Quality Testing vs Digital Assurance

Historically, businesses have relied on quality testing and quality assurance (QA) to ensure trust. While these practices are often lumped together, they are quite distinct. With testing, we check whether the software works as intended; testing is a way to gauge design, capability, and function. Assurance, on the other hand, ensures quality outcomes, meaning that the developed software corresponds to its specifications. QA is a preventive practice applied throughout the software development process to keep bugs out.

As enterprises continue to evolve, user experience and customer satisfaction are key measures of business success. Therefore, it’s important to move from testing and QA to the practice of digital assurance (DA). It’s not a complete abandonment of testing or QA, but rather an approach that builds on their techniques, assessing the business as a whole and ensuring smooth interactions between the components of the digital ecosystem.

Accelerating Digital Transformation with Digital Assurance

Digital transformation initiatives are implemented to accomplish certain business goals, and digital assurance helps ensure those objectives succeed through testing. But digital assurance doesn’t exist simply to assess projects; it’s about enhancing and perfecting the experience. Delivering a seamless customer experience and remaining competitive within a crowded business landscape are just two of the reasons behind the push for digital assurance.

With that, businesses are becoming increasingly aware of the importance of a comprehensive digital assurance strategy. Having a digital assurance strategy helps businesses with: 

  • Accelerating efforts to get to market
  • Improving application performance through monitoring of server-side and performance issues
  • Detecting defects early by using aggressive testing across cycles
  • Ensuring reliability and availability as security vulnerabilities are exposed using dependency solutions
  • Future-proofing applications to guarantee performance, scalability, and resiliency

A strong digital assurance strategy undoubtedly champions digital transformation initiatives, drives favorable business outcomes, increases business efficiencies, and helps deliver exceptional user experiences—all of which help businesses thrive within an environment propelled by innovative technologies.

Finding Opportunities through Digital Challenges

The emergence of digital transformation has triggered a seismic shift in priorities for business and IT leaders. The reality facing digital businesses today is that with the benefit of increased business agility comes the pressure to deliver with precision, accuracy, and speed. Digital assurance techniques can be deployed to ensure a digital initiative is working as intended.

For instance, many organizations struggle to connect new solutions to their legacy systems. A digital assurance approach would start by utilizing the specialist test services you would expect in a QA scenario, such as performance, regression, functional, mobile, compatibility, visual, security, Salesforce, IoT, service virtualization, and wearable testing.

However, these traditional techniques are rarely enough to keep pace with innovation and provide the amount of coverage now required. Specialist digital engineering companies can help. Digital engineering specialists can extend a digital assurance approach with quality engineering, helping businesses adjust their software development and delivery processes to achieve consistent quality. They can also expand test practices to include automation and AI solutions that boost the scale, speed, and accuracy of the whole operation.

How Customers are Driving Digital Transformation and Assurance

Customers drive digital transformation, and when it comes to user experience, their expectations are indeed high. Businesses have come to the realization that to achieve digital transformation success, it’s on them to deliver seamless customer experiences.

Today’s customers are tech-savvy and accustomed to instant gratification in their experiences. Mobile devices, apps, automation, and other technological advances fuel that mindset. At every point of interaction, whether through marketing, sales, or customer service, businesses must take a proactive approach, which drives the need for digital transformation.

As noted, digital transformation goes hand in hand with digital assurance. As businesses continue to interact with customers, the process needs to be rooted in quality measures that repeatedly test initiatives from inception to deployment. Simply put, quality assurance can help ensure a great customer experience.

Instead of reacting to problems, businesses employ practices that help anticipate issues, reducing errors and increasing customer satisfaction. One such practice, chaos engineering, is meant to reduce unwanted scenarios and identify failures before they escalate into outages.

Deliberately injecting failures into systems allows teams to explore system resilience and examine how it is engineered. With this, engineers can validate behaviors early and make necessary adjustments before problems manifest and lead to disruption, which can degrade user satisfaction.
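
A minimal sketch of the idea in Python: wrap a dependency call with a configurable failure injector, then verify that the user-facing code degrades gracefully rather than erroring out. The service, failure rate, and fallback here are all hypothetical:

```python
import random

# Hypothetical downstream dependency.
def fetch_recommendations(user_id: int) -> list[str]:
    return ["item-1", "item-2"]

def chaos(func, failure_rate: float):
    """Wrap a call so it randomly fails, simulating an unreliable dependency."""
    def wrapper(*args, **kwargs):
        if random.random() < failure_rate:
            raise ConnectionError("injected failure")
        return func(*args, **kwargs)
    return wrapper

def recommendations_with_fallback(user_id: int, unreliable_fetch) -> list[str]:
    """The behavior under test: degrade gracefully instead of erroring out."""
    try:
        return unreliable_fetch(user_id)
    except ConnectionError:
        return []  # fall back to an empty (but valid) response

if __name__ == "__main__":
    flaky = chaos(fetch_recommendations, failure_rate=0.5)
    # Run the experiment many times: the user-facing call must never raise.
    results = [recommendations_with_fallback(42, flaky) for _ in range(1000)]
    assert all(isinstance(r, list) for r in results)
    print("fallback held up under injected failures")
```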

Improving DX with Digital Assurance

It’s fair to say that innovative technologies and digital transformation create complexity for many businesses. To meet business goals and elevated customer expectations, organizations are increasingly shifting their focus toward digital-centric business models with the goal of achieving digital transformation. The unfortunate fact is that most will fail. Digital assurance can help.

Digital assurance helps businesses to accelerate their digital transformations by ensuring success of digital initiatives through testing. Digital assurance can also be applied to elevate customer experience, which is paramount for achieving DX. 

A digital assurance strategy that puts customers at the center of the digital transformation journey is key to future-proofing your business.

Transferring workload automation is one of the most difficult parts of cloud migration
https://sdtimes.com/cloud/transferring-workload-automation-is-one-of-the-most-difficult-parts-of-cloud-migration/ (Wed, 06 Jul 2022)

Enterprises need workload automation to connect all of the business processes of an organization together, but tool overlap has resulted in a complicated web that is difficult to break free from when trying to migrate.  

While workload automation is one of the older categories within automation, the space is now evolving towards more consolidation as a result of this complexity, according to Cem Dilmegani, founder and chief customer officer at AIMultiple, an AI industry analyst firm. 

He added that large companies have many different enterprise resource planning (ERP) setups on cloud or hybrid models. ITOps, then, has to rely on different vendor solutions and internally modified tools to keep the whole patchwork together. 

“The more tools that you have, the more complexity you have not only in terms of contracting but also in terms of training and getting the team to use the tool effectively,” Dilmegani said. “To be able to capture new crowds, vendors are adding new functionality and using new terminology in their marketing, but most of the time, because one of the primary benefits of this is getting to consolidate your ITOps automation at scale, we don’t see companies say ‘OK, for this specific batch of tasks I’m gonna use this and for this other batch, I’m gonna use this.’ They try to go after one vendor that offers them all sorts of capabilities.”

Many organizations have long used some sort of workflow orchestration or automation capability but haven’t used those tools to their full potential, he said.

“To have an enterprise, you pretty much need to be running these sorts of operations and in terms of the industry penetration of typical workload automation, I’m not sure I would expect penetration levels to be on the higher side,” Dilmegani added. “But what I also see is enterprises are transitioning from some complicated setup to a simpler setup and they are working with fewer vendors and fewer internally developed technologies and migrating to a place where they can pretty much offload much of the complex things to a new piece of software. So there is an opportunity for the simplification of versatile automation environments.”

Once workload automation is set up correctly, enterprises can automate a wide range of tasks, from copying files from one location to another to more complex work like provisioning and configuring new servers. These business tasks can then be viewed through a single application that IT departments manage across physical, virtual, and cloud environments.
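
Conceptually, that single application is a dependency-ordered workflow over heterogeneous tasks. A toy sketch of the model in Python (real products add scheduling, retries, agents, and cross-environment execution; the task names are invented):

```python
# Toy workload definition: each step is a task plus the steps it depends on.
# Real workload automation products layer scheduling, retries, agents, and
# cross-environment execution on top of this basic dependency model.

def copy_nightly_extract():
    print("copying /data/extract.csv to the archive (placeholder)")

def provision_report_server():
    print("provisioning report server (placeholder for a cloud API call)")

def notify_finance():
    print("notifying finance that reports are ready")

WORKFLOW = {
    "copy_extract": (copy_nightly_extract, []),
    "provision_server": (provision_report_server, []),
    "notify": (notify_finance, ["copy_extract", "provision_server"]),
}

def run(workflow: dict):
    """Execute tasks in dependency order (serial and acyclic, for clarity)."""
    done: set[str] = set()
    while len(done) < len(workflow):
        for name, (task, deps) in workflow.items():
            if name not in done and all(d in done for d in deps):
                task()
                done.add(name)

if __name__ == "__main__":
    run(WORKFLOW)
```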

Slipping over to SOAPs

Because traditional job scheduling and automation tools failed to keep pace with the complexity of digital businesses, workload automation has now shifted toward Service Orchestration and Automation Platforms (SOAPs), according to a BMC Software blog post. In its Market Guide for Service Orchestration and Automation Platforms, Gartner predicts that “through 2024, 80% of organizations using workload automation tools will switch to SOAPs to orchestrate cloud-based workloads.”

SOAPs offer application workflow orchestration to create and manage workflows across multiple applications, event-driven automation to simplify IT processes, self-service automation, and many more capabilities. 

The field of workload automation can be hard to grasp because vendors use new terminology to describe similar technologies. There are dozens of categories of automation tools, and their capabilities tend to overlap, Dilmegani said.

Although it bears many names, IT-based workload automation technology is different from things like robotic process automation (RPA). While both aim to automate work, RPA is typically used to automate tasks within a single application, while workload automation automates tasks across multiple applications. It is built to handle the much more complex tasks and architectures that have emerged from the move toward microservices and Kubernetes.

Leveraging workload automation for ETL processes reduces time spent on repetitive data work and minimizes human intervention, reducing subsequent data errors. Automating data warehouse management through workload automation tools also increases the transparency of compliance reports, since all processes are recorded and leave a detailed audit trail. Lastly, it reduces the number of FTEs hired to complete repetitive tasks, according to Dilmegani.
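
The audit-trail half of that claim can be illustrated in a few lines of Python: a decorator that records when each automated step ran, how long it took, and whether it succeeded, producing the kind of record a compliance report draws on. The log format is illustrative only:

```python
import functools
import json
import time

AUDIT_LOG = "audit_trail.jsonl"  # illustrative: one JSON record per executed step

def audited(step_name: str):
    """Record start time, duration, and outcome of each automated step."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            entry = {"step": step_name, "started": time.time()}
            try:
                result = func(*args, **kwargs)
                entry["status"] = "success"
                return result
            except Exception as exc:
                entry["status"] = f"failed: {exc}"
                raise
            finally:
                entry["duration_s"] = round(time.time() - entry["started"], 3)
                with open(AUDIT_LOG, "a") as log:
                    log.write(json.dumps(entry) + "\n")
        return wrapper
    return decorator

@audited("load_warehouse")
def load_warehouse(rows: list[dict]) -> int:
    return len(rows)  # placeholder for the real load step

if __name__ == "__main__":
    load_warehouse([{"id": 1}, {"id": 2}])
```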

“The ultimate goal of the workload automation is to eventually have end-to-end control over processes that involve different types of IT or business tasks,” said Alexandra Thurel, the director of product management for automation and solutions at HCL. 

A critical piece of transformation

Workload automation tooling is now considered a critical element of infrastructure that is moving to the cloud or to Kubernetes architectures during digital transformation, according to Thurel.

“[Companies] look at evolving from on-premises complex application and rehosting or rewriting their application in the cloud, and they need to have the layer that helps orchestrate those new applications with the rest of the world because not all applications are going into the cloud at the same time,” Thurel said. “So they will need to ensure that some file transfer or database inventory that still runs on-premises is connected with the processes that are newly created in the cloud. They need to manage entry points and exit points between the processes.”

Companies pursuing digital transformation are either still investing in legacy systems that can be distributed, readjusting and re-architecting applications with a lift-and-shift approach to run in the cloud, or rebuilding and reinventing their applications to become cloud-native. All three of these strategies have one thing in common.

They all have business processes that interconnect different platforms and heterogeneous systems, which brings challenges and risks. These application workloads no longer sit in predefined data centers and are now spread across multiple clouds, which is where workload automation becomes essential, according to Thurel.

Organizations need to embrace a systematic approach, avoiding islands of automation where each context is being managed by a different tool. Organizations also need to manage their data flows as more data becomes available. Here, the file transfer capability that workload automation excels at is becoming more important. 

Some of the workload automation tools out today leverage historical workload execution data with AI to expose observable data and provide an enhanced operational experience. For example, HCL Workload Automation can optimize data transfers and processing by leveraging a single point of control and integration for MFT, RPA, and big data applications.

Expanding automation tasks

When done right, companies often find that workload automation’s usefulness doesn’t stop at one specific task.

They initially look at specific objectives, such as improving a paycheck process or inventory management with workload automation, but soon realize they can expand automation to the ecosystem of applications that flow around that process, Thurel added.

Their needs are changing too. Previously, companies wanted a control point that ran on-premises in their data center; today they want that control closer to their new applications. Workload automation now helps them gain observability into their cloud environment, or even run against Kubernetes standards, allowing for more flexibility, more scalability, and higher speeds.

Vendors now offer new orchestration flexibility that enables users to model their processes very precisely and to define where they want the point of control.

Workload automation can map automation onto the specific control points that matter for the business. Then, if a job fails, an action can quickly fix the problem so business processes continue to execute. The tool can also suggest where executions have a high risk of failing, because it draws on data from the millions of jobs executed every day. If there’s an anomaly, the intelligent system can tell the user how to act preventatively.
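
The statistical core of such a preventative warning can be quite simple. A sketch that flags a job whose current runtime sits far outside its own history, using a z-score check (commercial tools use much richer models; the numbers are invented):

```python
from statistics import mean, stdev

def runtime_anomaly(history: list[float], current: float,
                    threshold: float = 3.0) -> bool:
    """Flag the current runtime if it is more than `threshold` standard
    deviations away from the job's historical mean."""
    if len(history) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > threshold

# Example: a nightly job that usually takes ~10 minutes suddenly takes 45.
history = [9.8, 10.2, 10.0, 9.9, 10.3, 10.1]
print(runtime_anomaly(history, 45.0))  # True -> alert the operator early
```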

Despite the benefits, moving to a new platform along with workload automation is not an easy task.

If a company makes mistakes during reporting, it could be taxed at a higher rate or report lower revenues, along with all sorts of other issues. And outages on the ITOps side could break the business if issues are not reported in a timely fashion.

Workload automation is one of the most difficult things about migrating to different platforms and is therefore one of the biggest obstacles, according to Dilmegani. 

“The workflow automation domain is much more at the core of your business. And, it’s also a bit less known because it’s sort of done in a back office,” Dilmegani said. “But it’s stuff that shouldn’t break, and that brings some risk aversion with that, which is why you are ending up with a complex landscape. Today, there are plenty of opportunities for most companies to simplify.”

Optimize data transfer and integrate file transfer in your automation workflows
https://sdtimes.com/data/optimize-data-transfer-and-integrate-file-transfer-in-your-automation-workflows/ (Fri, 01 Apr 2022)

Workload automation is a critical piece of digital transformation. It enables practitioners to schedule and execute business process workflows, optimize data transfer and processing, and cut down on errors and delays in executing the business processes themselves.

Businesses currently have three main approaches to modernization and digital transformation.

One is that they are, in some cases, still investing in legacy systems that could be distributed. The second approach is to readjust and re-architect with a lift-and-shift approach so different applications run on the cloud. Lastly, they are looking to rebuild and reinvent their applications to become cloud-native.

All of these strategies have a common factor: business processes are interconnected across platforms and heterogeneous systems, which brings challenges and risks.

“Application workloads are no longer sitting in predefined data centers and are now spread across multiple clouds, bringing a challenge that they need to be managed and mitigated,” said Francesca Curzi, the HCL software global sales leader for workload automation, mainframe, and data platform. 

Customers need to embrace a systematic approach, avoiding islands of automation where each context is managed by a different tool. Organizations also need to manage their data flows as more data becomes available. Here, file transfer capability is becoming more and more important to keeping everything interconnected, Curzi added.

The new HCL Workload Automation v.10, launched on March 4, offers unique technology to enable this kind of digital transformation and to tackle these challenges. It can execute any type of job anywhere: on-premises or in the cloud of one’s choice. The tool leverages historical workload execution data with AI to expose observable data and provide an enhanced operational experience.

“It removes these islands of automation across different applications and brings unique capabilities with advanced models into the market,” said Marco Cardelli, HWA lead product manager.

HCL Workload Automation can optimize data transfers and processing by leveraging a single point of control and integration for MFT, RPA, and big data applications. 

Schedulers and operators will benefit from the tool’s flexibility, and executives can feel safer with a robust, long-established market-leading technology that takes care of business continuity.

All of the plugins that come with the new version provide a way to orchestrate different applications without writing scripts to manage them. Users of Workload Automation v.10 get a plugin panel in the web user interface where they define exactly what kind of job they want and simply provide the parameters to orchestrate it.
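
The pattern behind such plugin panels is declarative job definitions: the user supplies parameters, and a plugin turns them into execution, with no script to write or maintain. A hypothetical sketch of the pattern in Python; this is not HCL’s actual job format:

```python
# Hypothetical declarative job definition: the user supplies parameters,
# and a plugin registry maps the job type to executable behavior. This
# illustrates the pattern, not HCL Workload Automation's actual format.

job_definition = {
    "type": "file_transfer",
    "params": {"source": "sftp://erp/export.csv", "target": "s3://lake/raw/"},
}

PLUGINS = {
    "file_transfer": lambda p: print(f"transferring {p['source']} -> {p['target']}"),
    "sap_job": lambda p: print(f"running SAP job {p.get('name')}"),
}

def run_job(definition: dict):
    plugin = PLUGINS[definition["type"]]  # pick behavior from the job type
    plugin(definition["params"])          # parameters drive the execution

run_job(job_definition)
```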

The solution offers ERP integrations such as SAP, Oracle E-Business Suite, and PeopleSoft, and big data integrations like Informatica, Hadoop, Cognos, DataStage, and more. It also offers multiple ways to manage message queues, web services, RESTful APIs, and more.

Last, but also very important, HCL is automating some RPA tools, offering the ability to orchestrate the execution of bots, in particular on Automation Anywhere and Blue Prism, with IBM RPA support planned for this year.

Users will also benefit from AI and ML capabilities. Version 10 offers anomaly detection and identification of patterns in the workload execution.

“In the future, we also want to take care of noise reduction related to alerts and messages of the product to help our operators to fix job issues, providing root cause analysis, and suggest self-healing based on historical data, and to also improve the usability of the dynamic workload console by allowing AI to help customers to define objects to find features and so on,” Curzi said.

There is also a new component called AI Data Advisory, available for containers. It applies big data, machine learning, and analytics technologies to Workload Automation data and provides anomaly detection. From there, a dedicated UI provides historical data analysis for jobs and workstations, empowering operators.

With digital transformation, organizations can take advantage of the most advanced workload scheduling, managed file transfer, and real-time monitoring capabilities for continuous automation. In addition, organizations can keep control of their automation processes from a single point of access and monitoring. For more information, click here.

Start your 90-day free trial and get hands-on experience with a one-stop automation platform: click here.

Content provided by SD Times and HCL Workload Automation

Remember ‘people over processes’
https://sdtimes.com/softwaredev/remember-people-over-processes/ (Fri, 11 Feb 2022)

Here we are, late January, with a “bomb cyclone” weather pattern about to drop a foot or more of snow on us here in New York. What better time to hunker down and reflect on the last year and to determine what is really important in the year ahead?

Many – if not most – of the technology conversations we had in 2021 centered on two things: Digital transformation and the speed it can enable; and the fact that despite advancements in AI and RPA (see the article in this issue), work remains about people – both those that are doing the work, and those on the receiving end of that work.

Of course, the virus that shall not go away has changed much in the world. Many organizations have allowed a hybrid remote/in-office approach to work, or shut down completely, and many workers are struggling with mental health issues due to the isolation that quarantines and face masks have created. Of course, there are the outside stresses of climate change, impending war in Eastern Europe and, in America and elsewhere, deep and widespread divisiveness over issues that affect our daily lives.

Largely because of these new conditions, workers are feeling more stress than ever to just maintain, while still being effective in their jobs and able to deliver valuable work to their organizations. 

Much of that stress comes from organizations responding to the constant drumbeat in the media and among software vendors that transforming the organization to move faster, crank out more features and just go-go-go is the only way for businesses to survive. While this may be true for the largest organizations, which by their sheer size and influence drive the narrative, many midsize and smaller companies are competitive and doing well with the processes and tools – and most importantly the people – that they’ve had in place for years. Their markets aren’t changing rapidly, and the need to pivot and react to every new initiative coming down the pike just isn’t there.

Adding to worker anxiety is the fact that they are being asked to do things they haven’t been trained to do and likely don’t have much desire to do. Developers, for instance, are being asked to become test engineers, and take responsibility for security, all of which takes time from what gives them satisfaction on the job – writing code, innovating, coming up with new approaches to problem-solving, and creating wonderful new features for their users. Some workers are embracing the new challenges of learning new skills. Others clearly are not.

They also have to deal with a massive influx of new tooling into their organizations; on top of everything else they need to learn, developers must master these new tools as well. Again, more time taken away from coding.

Much of technology is about tools and solutions, for automation in testing and continuous improvement. But what we’re hearing from more than a few developers is that they feel like they’re just another cog in the wheel, that their concerns and desires aren’t being heard, and that their organizations are moving people around so much that they just can’t get comfortable.

All this has played a role in what is being called “The Great Resignation.” People are leaving jobs in record numbers. Some are simply looking to be more highly compensated than they have been because of the shortage of tech workers. Others are seeking meaning in the work, flexibility to set their hours, and – as more are working from home – a balance of work life and family time.

It’s time for the industry to take a step back and analyze the true cost of blindly going for speed. Workers are dissatisfied, burned out and looking for a better way. Some take solace in working on their own projects on their own time. Others clear their heads through video games. Many are wondering if they’re simply pawns in a corporate game.

There has been somewhat of an awakening to these problems, as we’re hearing software companies starting to place their people above their processes. (Where have we heard that before??) We seem to have forgotten that in the never-ending race to the top. You can have all the tools and automations in place to deliver software like the wind, but it’s the people who ideate, create, innovate and execute who should be prioritized above all. 

Digital architects rose to the challenge – So must databases
https://sdtimes.com/data/digital-architects-rose-to-the-challenge-so-must-databases/ (Fri, 15 Oct 2021)

The COVID-19 pandemic has accelerated some digital aspirations and projects, and slammed the brakes on others. A recent Couchbase report reveals that roughly a third of companies interviewed are delaying legacy database upgrades because of the pandemic. A slightly larger group, some 34%, say it has had the opposite effect.

Architects bear much of the responsibility for delivering on these digital aspirations, with the survey revealing that 48% of this group are currently under high or extreme pressure to deliver digital projects – up from 19% a year ago. Fortunately, the numbers also reveal that almost half are on time with their plans. We can surmise that architects are rising to the challenge.

The same cannot be said of the technologies they rely on, however. According to the 450 architects interviewed, an incredible 86% said it was now harder to get the right technology in place for their projects. Legacy databases, in particular, are a major sticking point. Of the respondents, 91% still rely on legacy databases, while 61% say legacy databases make it harder for them to implement new projects.

The pain is real: more than 60% of companies plan to reduce their reliance on legacy databases, and nearly as many are already doing so.

Modernization at stake

The results of the survey should give pause because they shine a spotlight on a fundamental challenge with software projects. Namely, that the ability to deliver services depends on the underlying technology stack’s support. If that support isn’t present for whatever reason, the rest becomes exponentially more challenging and sometimes impossible to achieve.

It could explain why, even though pressure and output among architects have more than doubled, only about half of projects are on time. Again, we come back to that revealing case in point – the high dependence on legacy technology, and the fact that nearly two-thirds of architects cite these databases as clear barriers.

There is more evidence in the report, including that 61% of companies say legacy databases make it harder to implement new projects. Yet 91% still rely on relational databases, even though they admit those databases do not have sufficient potential to enable modernization to the extent required.

The database dilemma

Why is that the case? When asked which technologies have the most potential for improving modernization, the Cloud leads (68%), followed by Big Data (59%), and Automation and Artificial Intelligence next at just over 50% each. IoT, Mobile Applications, and DevOps also rank very high.

Traditional relational databases settle at just below 30%. Yet, weighed against the facts, this likely has more to do with the comfort of a known technology than its actual potential. Indeed, two-thirds of respondents consider relational databases to be a barrier to their success, because of their inflexibility. 

If we account for other research findings, the sentiment leans even more towards modern databases. Gartner predicts that by 2022, 75% of all databases will have migrated to a cloud platform. The database market reflects this trend: in 2018 the database management market grew by 18.4% (accumulating a value of $46 billion), and cloud database management systems generated 68% of that growth. A report from Adroit Market Research has similar predictions for cloud database adoption, noting that “the growing need for self-driving cloud databases within the enterprise is one of the major factors anticipated to drive industry growth.”

The case against legacy and on-premises-only databases strengthens when one looks at the reasons people want to move away from them. When asked what would make it easier to reach their goals, 65% of respondents said moving from on-premises databases to the cloud would help, while 49% of digital architects noted that moving to NoSQL databases has been significantly helpful in reaching their goals.
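
Part of NoSQL’s appeal is schema flexibility: documents can gain fields without a coordinated migration, which matters when teams need to move fast. A toy illustration using plain Python dicts as stand-in documents (a real document database adds persistence, indexing, querying, and replication):

```python
# Stand-in "document store": in a real NoSQL database these would be JSON
# documents; a dict keyed by document id keeps the idea visible here.
store: dict[str, dict] = {}

# Version 1 of the application writes this shape:
store["user:1"] = {"name": "Ada", "email": "ada@example.com"}

# Version 2 adds a field -- no ALTER TABLE, no migration window:
store["user:2"] = {"name": "Grace", "email": "grace@example.com",
                   "preferences": {"theme": "dark"}}

# Readers tolerate both shapes by treating new fields as optional:
for doc in store.values():
    theme = doc.get("preferences", {}).get("theme", "default")
    print(doc["name"], theme)
```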

If we take a step back to survey the current software landscape, the fundamentals are clear. Networks, storage, applications, and the other building blocks of software have been undergoing shifts to accommodate today’s modern requirements. Even PCs have become virtualized. It makes complete sense that databases should be a key part of this change.

The findings of this report shine a light on conflicting expectations and experiences among architects: people want to move forward, yet they concede that factors such as legacy databases are holding them back. Fortunately, NoSQL databases are providing better and easier ways for architects, developers, and everyone else in the organization to make the switch more seamlessly than ever before. One might argue that legacy databases are the last thing standing in the way of modernization.

3 common process automation mistakes (and how to fix them)
https://sdtimes.com/softwaredev/3-common-process-automation-mistakes-and-how-to-fix-them/ (Mon, 30 Aug 2021)

Like their cloud-native counterparts, many large or longstanding enterprises aspire to automate as much of their operations as possible. As a result, many of them get overly ambitious with their process automation goals, and attempt to roll out sweeping, company-wide digital transformation initiatives. While ambition is a good thing, many of these initiatives take years to complete, and often require ripping and replacing legacy systems. 

Few organizations consider that end-to-end process automation takes a change in mindset that spans people, processes and technology. Let’s take a look at three of the most common process automation mistakes, and how organizations can work together to fix them.

  1. Rolling Out Strategic Automation Initiatives Too Fast

While there’s nothing wrong with being strategic, thinking on too large of a scale is a common pitfall of overly ambitious automation projects. Taking on too much strategic work too early runs a high risk that the organization doesn’t see any business value for a long time. As a result, developers will most likely get completely stuck in shaping a complex platform without understanding its use case.

Instead, try to break down larger strategic initiatives into component parts, starting with the most urgent or important projects first. Here’s one way to approach it:

  • Start with a Pilot Project: The goal of this project is to define and validate both the architecture and the stack. Very often, this pilot project is set up as a proof-of-concept (PoC). However, it is important to go live with the pilot to really learn about all aspects of the workflow solution throughout the full software development life cycle (SDLC).
  • Accelerate to a Lighthouse Project: Soon after running a successful pilot, you should tackle a lighthouse project. This project should have a broader, but still realistic scope which can be better leveraged to show off architecture, tooling, and value of workflow automation to other people and teams within your organization.
  • Progress to Broad-Scale Transformation: Leverage the lessons learned from the lighthouse project, empowering the people on that project team to run a Center of Excellence (CoE) to break down silos across teams and drive organization-wide change.

I call this approach “the art of gradual transformation.” Ideally, before approaching a large-scale automation project, try to map out the entire ecosystem of processes — including the people, systems and devices at work in the background. Start by modernizing high impact processes that affect customers the most. Then design a transformation approach that fits the business’ or customers’ needs, rather than your technology stack’s requirements.

  2. Handling Automation Projects in Silos

Even though a gradual transformation approach is recommended, it does not mean “siloed” or without structure. If each team chooses its own tools, it can be hard to effect organization-wide change, or end-to-end process automation. Technology decisions are a commitment for years and sometimes even decades. These decisions and the resulting maintenance affect more than just the current team in the trenches.

As mentioned above, a CoE approach can help break down organizational silos and share best-practices on what has worked or not worked in previous automation projects. Ideally, this group does not dictate arbitrary standards, but maintains a list of approved tools and frameworks that can be reused across the entire company. 

Beyond tooling alone, a CoE can also maintain start guides, project templates, and reusable open source components/libraries for teams to leverage. In addition, they can serve as advocates for automation, by running a community to raise awareness for new automation initiatives within the company. Within this framework, more teams can get inspired by the potential for automation within their departments.

  3. Failing to Embrace Microservices Architectures

One important factor to address is the way software is built within the company. Embracing a microservices architecture in a legacy company is easier said than done. Often, there are legacy systems in place that are difficult to unseat. By one estimate, there are still more than 200 billion lines of code written in COBOL, a decades-old programming language. A wholesale replacement of these systems could cost $4 trillion to $8 trillion, or more.

That’s where the gradual transformation approach comes into play. For example, many companies have surface-level automations in place with RPA implementations sitting on top of legacy systems (like those written in COBOL). A good approach for these scenarios would be to go through a modernization in three main stages:

  1. Orchestrate all of these RPA bot-driven local task automations along the end-to-end business processes.
  2. Sunset these bots one by one, in order of priority.
  3. Invest in rewriting the underlying business logic as microservices, which can again be orchestrated along the end-to-end business processes.
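
One way to keep stage two low-risk is to put a common interface in front of each orchestrated task, so the end-to-end process never knows whether an RPA bot or a new microservice fulfills it. A sketch of that seam in Python; the task and function names are illustrative:

```python
from typing import Callable

# Each business task is fulfilled by some implementation: today an RPA bot,
# tomorrow a microservice. The end-to-end process only sees the task name.

def invoice_via_rpa_bot(order: dict) -> str:
    return f"bot keyed invoice for order {order['id']} into the legacy UI"

def invoice_via_microservice(order: dict) -> str:
    return f"microservice created invoice for order {order['id']}"

TASKS: dict[str, Callable[[dict], str]] = {
    "create_invoice": invoice_via_rpa_bot,  # stage 1: bot is orchestrated
}

def run_process(order: dict):
    print(TASKS["create_invoice"](order))

run_process({"id": 7})

# Stages 2 and 3: sunset the bot by swapping the implementation; the
# end-to-end process definition does not change.
TASKS["create_invoice"] = invoice_via_microservice
run_process({"id": 8})
```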

The advantage of a microservices-based automation workflow is that it allows for a decentralized architecture where each team “owns” its own isolated processes. In the event something goes wrong with a single process, it can be easily controlled and fixed. From there, a process engine can “drive” these microservices-based processes across the organization, and unify them where it makes sense.

To sum up, end-to-end process orchestration can’t happen in a vacuum. Stakeholders from across the organization should be involved, and projects should start small. Without a clear pilot project, large-scale, strategic automation efforts can easily fail to prove their business value. By working together to define priorities, create best-practices, and roll out the technological changes needed, organizations can ensure that end-to-end process automation happens successfully.

To learn much more about process automation, sign up for the virtual CamundaCon 2021 event on September 22-23.

How transformation works in practice
https://sdtimes.com/ab-testing/how-transformation-works-in-practice/ (Tue, 10 Aug 2021)

Transformations take time. People think you can bring in a tech transformation coach, change everybody’s job title, and get them on a quick “sheep dip” of a certified scrum team member. But in organizations, transformation happens incrementally. The good news is that the benefits of transformation can start to be delivered straight away.

The goal is to find a way of producing the software that’s needed at an affordable cost with high enough quality that it stays useful over time. This is where Behavior-driven development (BDD) and automated testing and quality practices come in.

Go Fast, Start Slow

One challenge to achieving quality at speed is that people want to dive into a project without knowing what they need to do. Discovery, the first practice of BDD, helps you work in a more effective way and focus on the most important aspects of your project. Discovery ensures that we don’t start doing something and then say, “Oh, I didn’t think about that,” or, “I misunderstood what I was being asked to do, so I need to throw away what I just did and start again.”

Discovery builds on the Agile techniques of deferring detailed requirements planning until the last responsible moment. Essentially, Discovery lets us slice our user stories into the smallest practical increments, and then study those in detail to figure out how much work each of them requires. We then prioritize, which means we might only work on a few of them.

Discovery Accelerates Quality

Someone might object to this process because if you take the time to break everything down before you start, that doesn’t seem speedy enough. But in reality, you work in a far more efficient manner after completing these steps. You begin the project by cutting out the stuff you don’t need to do, before you waste any time doing it. By prioritizing the most important features and reaching a shared understanding of them, you maximize the amount of work you don’t need to spend time on. 

So, you do less work. More importantly, you do the right work.

Customers often come to BDD because they’re looking to automate their testing so they can release more quickly and with higher quality. But automation alone offers no return on investment. In fact, it carries a cost, because that level of automation is time-consuming to write, hard to debug, and costly to maintain.

With BDD, on the other hand, you start with Discovery, which means you get to a shared understanding. You prioritize just what you need and no more. You formulate it in business language, so it has meaning to everyone who understands the business domain, and the automation gives you the opportunity to increase throughput. By applying BDD within an Agile context, you get efficiency, throughput, and quality.

Achieve Quality in Spite of Risk

When delivering software at speed it’s not enough to develop the required software. You need the confidence that the quality level is compatible with your risk appetite. This is where the secondary output of BDD comes in: the automated tests.

Different businesses have different risk profiles. It’s not a disaster if a pizza order goes missing, but if your business is subject to governmental health regulations, getting a small thing wrong means people might die. We need to understand our risk profile and make sure we’ve got processes in place to ensure the software we deliver matches that risk profile. Automated testing can be an important part of that. BDD gives you automated acceptance tests that verify the software behaves correctly and delivers the functionality required by the business.
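
Tools such as Cucumber, SpecFlow, or behave implement this by binding each line of a business-language scenario to a step definition in code. A self-contained Python sketch of that binding (a real BDD tool adds feature-file parsing, reporting, and much more; the scenario is invented):

```python
import re

# A business-language scenario (Gherkin-style); in a real BDD tool this
# would live in a .feature file reviewed by non-technical stakeholders.
SCENARIO = """\
Given a pizza order for 2 pizzas
When the customer cancels 1 pizza
Then the order contains 1 pizza
"""

order = {"pizzas": 0}

def given_order(n):
    order["pizzas"] = int(n)

def when_cancel(n):
    order["pizzas"] -= int(n)

def then_contains(n):
    assert order["pizzas"] == int(n), "documentation and system diverge"

# Step definitions bind each business-language line to executable code.
STEPS = [
    (r"Given a pizza order for (\d+) pizzas?", given_order),
    (r"When the customer cancels (\d+) pizzas?", when_cancel),
    (r"Then the order contains (\d+) pizzas?", then_contains),
]

for line in SCENARIO.strip().splitlines():
    for pattern, step in STEPS:
        match = re.fullmatch(pattern, line.strip())
        if match:
            step(*match.groups())
            break
    else:
        raise LookupError(f"no step definition for: {line}")

print("scenario passed")
```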

Start With Documentation

Everyone who’s ever put together a piece of flat-pack furniture, or bought electronic goods off the internet, knows that the instructions often don’t appear to relate to the device that was shipped. Anyone who works in software has dealt with documentation that is clearly incorrect. It may have been correct once, but it’s not correct anymore.

With BDD, because you’re specifying requirements using business language, those specifications are the documentation. There are tools that automate that documentation, so you immediately see when the documentation and the system implementation diverge. They may diverge because someone’s introduced a defect, or they may have diverged because the documentation is out of date. Whatever the reason, you’re automatically notified when they diverge. Then you can act, rather than needing to proactively schedule time every week, or every release, to review the documentation and work out whether it’s still correct. 

End with the Language that Everyone Uses

Industries that deal with external regulators can particularly benefit from using BDD, which writes the specifications in business language. Those specifications, directly automated, verify the software is behaving as expected. Running these specifications can be helpful to non-technical people. Because it’s written in business language, people can see which scenarios are being checked and the outcome of those scenarios. 

The regulatory authorities are thrilled to get this in business language, so they don’t have to go through hideous spreadsheets. There’s potential time and cost savings for customers who adopt business language and tools that support BDD.

The bumpy road of moving applications to the cloud
https://sdtimes.com/cloud/the-bumpy-road-of-moving-applications-to-the-cloud/ (Mon, 26 Jul 2021)

Everywhere you look, companies are involved in some sort of digital transformation. For some, it means moving their entire business to the cloud which may include building or purchasing software for payroll, ordering, fulfillment and many other activities. For larger firms, entrenched with legacy applications, it means migrating or rewriting hundreds or perhaps thousands of applications to the cloud.

This is no small feat, given the complexity and brittleness of many legacy applications. These applications have existed in some companies for decades and have been duct-taped and “band-aided” along the way to keep them working and stable. Developers have bolted on upgrade after upgrade, adding third-party security features, web and mobile capabilities, and so forth.

A digital transformation for many larger companies is an ongoing process, taking years to evaluate and move through the myriad complex applications, many mission critical and many very fragile. There’s a lot to consider as companies transform their business to the cloud. As organizations evaluate which apps to move to the cloud first, they must look at identity management, database architectures, cloud compatibility, and much more. 

Some organizations opt to move an application to the cloud through a lift-and-shift approach, without redesigning the application. This approach does not let the application take advantage of cloud benefits such as performance, stability, and possible cost savings. It was common in the early days of cloud migrations: a quick fix was to simply move an application to the cloud with all its databases intact. The approach often failed, too, because non-cloud-native apps are generally more expensive to run on a software-defined infrastructure.

However, most companies realize that many applications will need a complete rewrite to take advantage of the cloud’s performance and scalability benefits and to provide a much higher level of security for consumers of the application. As painful and costly as it is to rewrite an application, sometimes there is no other choice. Thankfully, cloud providers offer many features that make building an application much easier than before, such as a wide variety of databases, security controls, and on-demand bandwidth, to name just a few. Regardless of the path you take on your application’s journey to the cloud, it’s important to keep in mind that application testing is crucial.

Testing never goes out of style

Some may think that since an application is being migrated or completely rewritten for the cloud, there isn’t much need for extensive testing. After all, the cloud is going to solve our scalability and performance challenges, right? In reality, ongoing testing is critical in either approach. Even smaller micro releases, associated with agile practices, don’t remove testing and quality concerns.

When an application is being migrated to the cloud, there are many areas that need to be checked after the migration to ensure the application is operating properly.

These areas include, but are not limited to, load testing to measure performance and response time and UI testing to ensure the application is still performing as intended for the consumer. There are simply too many unforeseen glitches of migrating an application to the cloud that cannot be discovered without proper testing. 

Comprehensive testing is also crucial to the success of applications being rewritten (or refactored) for the cloud, and it includes the two previously mentioned factors along with the others discussed below.

  • Specific Test Data

All applications require test data (credit card numbers, addresses, names, etc.) during testing. This data can be gathered either by masking production data or by generating synthetic data from scratch. Where do you get this data, and how much time will it take to build the right data? Data, or the lack of it, can be a real drag on building applications for the cloud.
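
A sketch of the synthetic route in Python: generate realistic but entirely fabricated records, including card numbers that pass a Luhn check so validation logic can be exercised without ever touching production data. The field choices are illustrative:

```python
import random

def luhn_check_digit(payload: str) -> str:
    """Compute the final digit that makes payload + digit pass a Luhn check."""
    total = 0
    for i, ch in enumerate(reversed(payload)):
        d = int(ch)
        if i % 2 == 0:  # double every second digit from the right
            d = d * 2 - 9 if d * 2 > 9 else d * 2
        total += d
    return str((10 - total % 10) % 10)

def fake_card_number() -> str:
    payload = "4" + "".join(random.choices("0123456789", k=14))  # 16 digits
    return payload + luhn_check_digit(payload)

def fake_customer() -> dict:
    first = random.choice(["Ada", "Grace", "Alan", "Edsger"])
    last = random.choice(["Lovelace", "Hopper", "Turing", "Dijkstra"])
    return {
        "name": f"{first} {last}",
        "email": f"{first.lower()}.{last.lower()}@example.com",
        "card": fake_card_number(),  # fabricated by construction
    }

if __name__ == "__main__":
    for row in (fake_customer() for _ in range(3)):
        print(row)
```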

  • Unique Test Environments

Testing should simulate real-world conditions as closely as possible. This can get complex when the application you’re testing needs access to backend databases, mainframe systems, third-party APIs, and the like. Simulating responses from these systems (known as service virtualization) allows testers to receive a “virtual” response as if it came from the real thing. This approach saves testers and developers countless hours and the costs associated with accessing live systems.
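
A minimal sketch of the idea: stand up a tiny in-process fake of a backend API, point the code under test at it, and get deterministic “virtual” responses with no live system involved. The endpoint and payload are invented:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class VirtualBackend(BaseHTTPRequestHandler):
    """A stand-in for a real backend system: serves canned responses."""
    def do_GET(self):
        body = json.dumps({"account": "12345", "balance": 99.50}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep test output quiet
        pass

def start_virtual_service() -> str:
    server = HTTPServer(("127.0.0.1", 0), VirtualBackend)  # 0 = any free port
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return f"http://127.0.0.1:{server.server_port}"

if __name__ == "__main__":
    base_url = start_virtual_service()
    # The code under test calls the "backend" exactly as it would in production.
    with urllib.request.urlopen(f"{base_url}/accounts/12345") as resp:
        data = json.load(resp)
    assert data["balance"] == 99.50
    print("got a virtual response:", data)
```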

You’ve convinced me I still need to test my cloud apps, now what?

Chances are you and your team are already using a grab bag of testing tools. Some are open-source, some are purchased, and some are probably homegrown. But many companies are now evaluating the benefits of using an open-source, continuous testing framework that can span the DevOps toolchain.

There is a need for a more automated approach to testing where both testers and developers can be a part of building quality applications, faster. Perhaps you are part of a Center of Excellence and are looking to standardize on a particular test platform that meets the needs of both teams. A testing platform can bring you this type of democratization.

There is also a need to shift testing to the left so that it starts much earlier in the development process. Without a test platform, this kind of frequent testing is more difficult because monitoring the various checkpoints cannot be easily automated. Many single-purpose testing tools were not architected with digital transformations to the cloud in mind. They serve a single function, and probably do it well, but do not provide the ability to collaborate.

When viewed as a whole, a comprehensive testing platform brings these functions together, with each one able to be handled by an individual team. The beauty of this approach is that critical aspects of application testing will be included in your sprints, allowing you to shift testing to the left.

Make your journey to the cloud easier 

If your testing tools look more like a cluttered grab bag of expensive single-purpose gizmos, then it’s probably time to consider how a testing platform can be a better solution as you move your legacy applications to the cloud. Using a cloud-based testing platform to test cloud applications just makes sense regardless of how you are structuring your digital transformation.

Content provided by SD Times and Broadcom.
