Forrester Archives — SD Times (https://sdtimes.com/tag/forrester/)

Software intelligence is key to creating better applications
SD Times, Mon, 06 Feb 2023 — https://sdtimes.com/software-development/software-intelligence-is-key-to-creating-better-applications/

Development teams are always on a mission to create better quality software, be more efficient, and please their users as much as possible.

The introduction of AI into the development pipeline makes this possible, from software intelligence to AI-assisted development tools. Both can work hand in hand to reach the same goal, but there’s a difference between software intelligence and intelligent software.

AI-assisted development tools are products that use AI to do things like suggest code, automate documentation, or generally increase productivity. Vincent Delaroche, founder and CEO of CAST, defines software intelligence as tools that analyze code to give you visibility into it, so you can understand how individual components work together and identify bugs or vulnerabilities.

So while intelligent software tools help you write better code, software intelligence tools sift through that code to verify it is as high quality as possible and recommend how to get there.

“Custom software is seen as a big complex black box that very few people understand clearly, including the subject matter experts of a given system,” said Delaroche. “When you have tens of millions of lines of code, which represent tens of thousands of individual components that all interact with each other, there is no one on the planet who can claim to be able to understand and control everything in such a complex piece of technology.”

Similarly, even the smartest developer doesn’t know every possible option available to them when writing code. That’s where AI-assisted development comes in, because these tools can suggest the best possible piece of code for the application. 

For example, a developer could provide a piece of code to ChatGPT and ask it for better ways of writing the code. 

According to Diego Lo Giudice, principal analyst at Forrester, Amazon DevOps Guru serves a similar purpose on the configuration side. It uses AI to detect possible operational issues and can be used to configure your pipelines better.

Lo Giudice explained that quality issues aren’t always the result of bad code; sometimes the systems around the software are not configured correctly and that can result in issues too, and these tools can help identify those problem configurations. 

George Apostolopoulos, head of analytics at Endor Labs, further explained the capabilities of software intelligence tools as being able to perform simple rules checks, provide counts and basic statistics like averages, and do more complex statistical analysis such as distributions, outliers and anomalies. 
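The basic checks Apostolopoulos describes can be sketched in a few lines. The metric values and the z-score threshold below are illustrative, not drawn from any particular tool:

```python
from statistics import mean, stdev

def analyze_metrics(values, z_threshold=2.0):
    """Counts, basic statistics, and z-score outlier detection over a
    per-component metric (e.g. cyclomatic complexity per module)."""
    avg = mean(values)
    sd = stdev(values)
    outliers = [v for v in values if sd > 0 and abs(v - avg) / sd > z_threshold]
    return {"count": len(values), "mean": avg, "stdev": sd, "outliers": outliers}
```

A component whose metric sits more than two standard deviations from the mean is flagged for a closer look; real tools would of course layer richer rules on top of this.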

Software intelligence is crucial if you’re working with dependencies

Software intelligence plays a big role not only in quality, but in security as well, solving a number of challenges with open source software (OSS) dependency. 

These tools can help by evaluating the security practices behind a dependency's development, scanning its code for vulnerabilities, and checking it for malicious code. They also use global data to identify attacks like typosquatting and dependency confusion.
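Typosquatting detection often reduces to flagging a new dependency whose name sits within a small edit distance of a popular package. A minimal sketch, where the "popular" list is a stand-in for the real global registry data such tools consume:

```python
def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

POPULAR = {"requests", "numpy", "pandas", "django"}  # stand-in for real registry data

def typosquat_suspects(name):
    """A dependency one edit away from a popular package deserves scrutiny."""
    return [p for p in POPULAR if 0 < levenshtein(name, p) <= 1]
```

An exact match (distance 0) is the legitimate package itself, so it is excluded; only near-misses are flagged.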

According to Apostolopoulos, there are a number of things that can go amiss when adding in new dependencies, updating old ones, or just changing code around. 

“In the last few years a number of attacks exposed the potential of the software supply chain for being a very effective attack vector with tremendous force multiplying effects,” said Apostolopoulos. “As a result, a new problem is to ensure that a dependency we want to introduce is not malicious, or a new version of an existing dependency does not become malicious (because its code or maintainer were compromised) or the developer does not fall victim to attacks targeting the development process like typosquatting or dependency confusion.”

When introducing new dependencies, there are a number of questions the developer needs to answer, such as which piece of code will actually solve their problem, as a start. Software intelligence tools come into play here by recommending candidates based on a number of criteria, such as popularity, activity, amount of support, and history of vulnerabilities.
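The candidate-ranking idea can be illustrated with a simple weighted score. The weights, field names, and package names below are hypothetical, chosen only to mirror the criteria listed above:

```python
def score_candidate(pkg):
    """Weighted score over popularity, activity, support, and vulnerability
    history; all inputs assumed normalized to [0, 1]. Weights are illustrative."""
    score = 0.0
    score += 0.3 * pkg["popularity"]     # e.g. normalized download counts
    score += 0.3 * pkg["activity"]       # recent commits / releases
    score += 0.2 * pkg["support"]        # maintainers, issue responsiveness
    score -= 0.2 * pkg["vuln_history"]   # past CVE count, normalized
    return score

candidates = [
    {"name": "lib-a", "popularity": 0.9, "activity": 0.8, "support": 0.7, "vuln_history": 0.5},
    {"name": "lib-b", "popularity": 0.6, "activity": 0.9, "support": 0.9, "vuln_history": 0.1},
]
best = max(candidates, key=score_candidate)
```

Note how a strong vulnerability history drags the more popular candidate below the better-maintained one.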

Then, to actually introduce this code, more questions pop up. “The dependency tree of a modestly complex piece of software will be very large,” Apostolopoulos noted. “Developers need to answer questions like: do I depend on a particular dependency? What is the potentially long chain of transitive dependencies that brings it in? In how many places in my code do I need it?” 

It is also possible in large codebases to be left with unused and out-of-date dependencies as code changes. “In a large codebase these are hard to find by reviewing the code, but after constructing an accurate and up to date dependency graph and call graph these can be automatically identified,” Apostolopoulos said. “Some developers may be comfortable with tools automatically generating pull requests that recommend changes to their code to fix issues and in this case, software intelligence can automatically create pull requests with the proposed actions.” 
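Both questions — which transitive chain pulls a dependency in, and which declared dependencies are never actually reached — reduce to graph traversal once the dependency graph and call graph exist. A sketch over a made-up dependency graph:

```python
from collections import deque

# Direct-dependency graph: package -> packages it pulls in (hypothetical names)
DEPS = {
    "my-app":        ["web-framework", "old-util"],
    "web-framework": ["http-client", "templating"],
    "http-client":   ["tls-lib"],
    "templating":    [],
    "old-util":      [],
    "tls-lib":       [],
}

def chain_to(root, target):
    """Breadth-first search: the shortest chain of transitive
    dependencies that brings `target` into the build."""
    queue = deque([[root]])
    seen = {root}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for dep in DEPS.get(path[-1], []):
            if dep not in seen:
                seen.add(dep)
                queue.append(path + [dep])
    return None

def unused_direct_deps(root, call_graph_reaches):
    """Direct deps declared in the manifest but never reached by the call graph."""
    return [d for d in DEPS[root] if d not in call_graph_reaches]
```

With an accurate call graph showing which packages the application actually exercises, the stale `old-util` entry falls out automatically.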

Having a tool that automatically provides you with this visibility can really reduce the mental effort required by developers to maintain their software. 

The software landscape is a “huge mess”

Delaroche said that many CIOs and CTOs may not be willing to admit it publicly, but the portfolios of software assets that run the world, housed in the largest corporations, are becoming a huge mess.

“It’s becoming less and less easy to control and to master and to manage and to evolve on,” said Delaroche. “Lots of CIOs and CTOs are overwhelmed by software complexity.”

In 2011, Marc Andreessen famously claimed that “software is eating the world.” Delaroche said this is more true than ever as software becomes more and more complex.

He brought up the recent example of Southwest Airlines. Over the holidays, the airline canceled over 2,500 flights, which was about 61% of its planned flights. The blame for this was placed on a number of issues: winter storms, staffing shortages, and outdated technology.

The airline’s chief operating officer Andrew Watterson said in a call with employees: “The process of matching up those crew members with the aircraft could not be handled by our technology … As a result, we had to ask our crew schedulers to do this manually, and it’s extraordinarily difficult … They would make great progress, and then some other disruption would happen, and it would unravel their work. So, we spent multiple days where we kind of got close to finishing the problem, and then it had to be reset.”

While something as disruptive as this may not happen every day, Delaroche said that every day companies are facing major crises. It’s just that the ones we know about are the ones that are big enough to make it into the press. 

“Once in a while we see a big business depending on software that fails,” he said. “I think that in five to ten years, this will be the case on a weekly basis.”

Another area to apply shift-left to

Over the last several years, elements of the software development process have shifted left. Galael Zino, founder and chief executive of NetFoundry, thinks that software analysis also needs to shift left.

This might sound counterintuitive. How can you analyze code that doesn’t exist yet? But Zino shared three changes developers can make to achieve this shift.

First, they should adopt a secure-by-design mentality. He recommends minimizing reliance on third-party libraries because often they contain much more than the specific use case you need. For the ones you do need, it’s important to do a thorough review of that code and its dependencies.

Second, developers should add more instrumentation than they think they will need because it’s easier to add instrumentation for analysis at the start than when something is already in production. 
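One lightweight way to instrument early, as a sketch: a decorator that records call counts and wall-clock timings so later analysis has data to draw on. The function being instrumented here is hypothetical:

```python
import functools
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("instrumentation")

def instrumented(fn):
    """Record call counts and timing for every invocation of `fn`."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            elapsed_ms = (time.perf_counter() - start) * 1000
            wrapper.calls += 1
            log.info("%s took %.2f ms (call #%d)", fn.__name__, elapsed_ms, wrapper.calls)
    wrapper.calls = 0
    return wrapper

@instrumented
def lookup(key):
    return {"a": 1}.get(key)
```

Adding this wrapper at development time is trivial; retrofitting equivalent telemetry onto a production system is not, which is Zino's point.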

Third, take steps to minimize the attack surface. The internet is the largest single surface area, so reduce risk by ensuring that your software only communicates with authorized users, devices, and servers. 

“Those entities still leverage Internet access, but they can’t access your app without cryptographically validated identity, authentication and authorization,” he said. 
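The gist of that gate can be sketched with a shared-key HMAC: a request is honored only when its identity is both cryptographically proven and explicitly allowlisted. A real deployment would use certificates or signed tokens rather than this toy shared key, and the identities below are hypothetical:

```python
import hashlib
import hmac

SHARED_KEY = b"demo-key-not-for-production"  # stand-in for real key material

def sign(identity):
    """Produce an HMAC-SHA256 tag proving possession of the shared key."""
    return hmac.new(SHARED_KEY, identity.encode(), hashlib.sha256).hexdigest()

def authorize(identity, signature, allowlist):
    """Reject anything whose identity is unproven or not explicitly allowed."""
    valid = hmac.compare_digest(sign(identity), signature)
    return valid and identity in allowlist

allow = {"billing-service", "alice@example.com"}
```

`hmac.compare_digest` is used instead of `==` to avoid leaking information through comparison timing.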

What does the future hold for these tools?

Over the past six months Lo Giudice has seen a big acceleration in adoption of tools that use large language models. 

However, he doesn’t expect everyone to be writing all their code using ChatGPT just yet. There are a lot of things that need to be in place before a company can really bring all this into their software development pipeline. 

Companies will need to start scaling these things up, define best practices, and define the guardrails that need to be put in place. Lo Giudice believes we are still about three to five years away from that happening. 

Another thing that the industry will have to grapple with as these tools come into more widespread use is the idea of proper attribution and copyright. 

In November 2022, there was a class-action lawsuit brought against GitHub Copilot, led by programmer and lawyer Matthew Butterick. 

The argument made in the suit is that GitHub violated open-source licenses by training Copilot on GitHub repositories. Eleven open-source licenses, including MIT, GPL, and Apache, require the creator’s name and copyright to be attributed. 

In addition to violating copyright, Butterick wrote that GitHub violated its own terms of service, DMCA 1202, and the California Consumer Privacy Act. 

“This is the first step in what will be a long jour­ney,” Butterick wrote on the webpage for the lawsuit. “As far as we know, this is the first class-action case in the US chal­leng­ing the train­ing and out­put of AI sys­tems. It will not be the last. AI sys­tems are not exempt from the law. Those who cre­ate and oper­ate these sys­tems must remain account­able. If com­pa­nies like Microsoft, GitHub, and OpenAI choose to dis­re­gard the law, they should not expect that we the pub­lic will sit still. AI needs to be fair & eth­i­cal for every­one. If it’s not, then it can never achieve its vaunted aims of ele­vat­ing human­ity. It will just become another way for the priv­i­leged few to profit from the work of the many.”

Value stream management provides predictability in unpredictable times
SD Times, Thu, 05 Jan 2023 — https://sdtimes.com/valuestream/value-stream-management-provides-predictability-in-unpredictable-times/

In 2019, most business leaders probably wouldn’t have predicted the changes that would be coming their way in early 2020 thanks to a global pandemic. If they had, perhaps they would have been able to make decisions more proactively and wouldn’t have had to scramble to convert their workforce to remote, digitize all their experiences, and deal with an economic downturn. 

Now, the country is in another period of uncertainty. You’ve read the headlines all year: the Great Resignation, layoffs, a possible recession, Elon Musk’s takeover of Twitter shaking up marketing spending, the introduction of tools like GitHub Copilot and ChatGPT leaving workers worried about their future job security, and more. The list could go on and on, but one thing that would help people through these times is knowing they’ll make it out okay on the other end.

Unfortunately, that level of predictability isn’t always possible in the real world. In the business world, though, value stream management can help provide it.

According to Lance Knight, president and COO of ConnectALL, the information you can get from value stream management can help you with predictability. This includes things like understanding how information flows and how you get work done. 

“You can’t really be predictable until you understand how things are getting done,” said Knight. 

He also claimed that predictability is a more important outcome of value stream management than the actual delivery of value, simply because “you can’t deliver value unless you have a predictable system.”

Derek Holt, general manager of Intelligent DevOps at Digital.ai, agreed, adding “If we can democratize the data internally, we can not only get a better view, but we can start to use things like machine learning to predict the future. Like, how do we not just show flow metrics, but how do we find areas for flow acceleration? Not just what are our quality metrics, but how do we drive quality improvement? A big one we’re seeing right now is predicting risk and changing risk. How do you predict that before it happens?”

Knight also said that a value stream is only as effective as the information that you feed into it, so you really need to amplify feedback loops, remove non-value-added activities and add automation. Then once your value stream is optimized, you can realize the benefit of predictability. 
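One common way to quantify the non-value-added activity Knight mentions is flow efficiency: value-adding time divided by total lead time. A sketch with made-up stage timings for a single work item:

```python
def flow_metrics(stages):
    """Lead time, value-added time, and flow efficiency for one work item.

    `stages` maps stage name -> (hours_spent, adds_value?); the stage
    names and hours below are illustrative."""
    lead_time = sum(h for h, _ in stages.values())
    value_time = sum(h for h, adds_value in stages.values() if adds_value)
    return {
        "lead_time_h": lead_time,
        "value_added_h": value_time,
        "flow_efficiency": value_time / lead_time,
    }

item = {
    "coding":         (16, True),
    "waiting_review": (40, False),  # queue time: non-value-added
    "review":         (4, True),
    "waiting_deploy": (20, False),
    "deploy":         (1, True),
}
```

Here roughly three quarters of the lead time is waiting, which is exactly the waste an optimized value stream targets.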

If you’ve already been working with value streams for a while then it may be time to make sure all those pieces are running smoothly and look for areas where there is waste that can be removed. 

Knight also explained the importance of embracing the “holistic part” in value stream management. What he means by this is not just thinking about metrics, but thinking about how you can train people to understand Lean principles so that they can understand how the way they develop software will meet their digital transformation needs. 

Challenges companies face 

Of course, all that is easier said than done. There are still challenges that companies face after adopting value stream management to actually get to the maturity level where they gain that predictability. 

One issue is that there is confusion in the market caused by vendors about what value stream management actually is. “Some people think value stream management is the automation of your DevOps pipeline. Some people think value stream management is the metrics that I get. And there’s confusion between value management and value stream management,” said Knight. 

Knight wants us to remember that value stream management isn’t anything new; it can trace its origins back to Lean manufacturing, created by Toyota in Japan in the 1950s.

And ultimately, value is just the delivery of goods and services. Putting any other definition on it is just the industry being confused, Knight believes. 

“So people who are trying to implement value streams are getting mixed messages, and that’s the number one challenge with value stream management,” said Knight.

Digital.ai’s Holt explained that another challenge, especially for those just getting started, is getting overwhelmed. 

“Don’t be paralyzed by how big it seems,” said Holt. He recommends companies have early conversations acknowledging that they might get things wrong, and just get started. 

Where has value stream been? Where is it headed? 

In our last Buyer’s Guide on value stream management, the theme was that it aligns business and IT. 

Holt has seen in the past year that companies are adopting mentalities that are less about that alignment. Now the focus is that software is the business and the business is software. 

In this new mentality, metrics have become crucial, so it’s important to have a value stream management system in place that actually enables you to track certain metrics. 

“Things like OKRs continued to kind of explode as a simple means to drive better outcome-based alignment … simple KPIs around objective-based development efforts or outcome-based development efforts,” said Holt. 

Holt also noted that in Digital.ai’s recently published 16th annual State of Agile report, around 40% of respondents had adopted one of these approaches, and that was significantly up from the previous year. 

He went on to explain that companies investing in value stream management want to be sure that their investments are actually paying off, especially in the current economic climate.

He also said value streams can help organizations make small, evolutionary improvements, rather than one big revolution. 

“Value stream management is building on some of the core transformations that happened before,” said Holt. “Without the Agile transformation, there would have been no DevOps, and without Agile and DevOps, there probably wouldn’t be an ability to talk about value stream management.”

So value stream management will continue to build on the successes of the past, while also layering in new trends like low code, explained Holt. 

What sets successful value stream management practices apart

Chris Condo, principal analyst at Forrester, last month wrote a blog post where he laid out the three qualities that set successful value stream management practitioners apart. 

  1. Use of AI/ML to predict end dates. According to Condo, development teams with access to predictive capabilities are able to use them to create timelines that are more likely to be met. He noted that the successful teams don’t replace estimates produced by people on their team, but rather augment those estimates with machine estimation. 
  2. Bottleneck analysis. Teams can use value stream management to discover what the real cause of their bottlenecks is. “When it comes to VSM, too many clients put the cart before the horse, thinking that they need a high-performing DevOps culture and tool chain to effectively use VSM. None of this could be further from the truth,” said Condo.
  3. Strong metrics and KPIs. Development leaders want these metrics if they are going to be putting money into value stream management, so look for vendors that can provide excellent metrics. 
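The first practice — augmenting human estimates with machine estimation rather than replacing them — can be as simple as regressing historical actuals against team estimates and applying the learned bias to new estimates. The history below is fabricated for illustration:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Historical (team estimate, actual duration) pairs in days — made-up data
history = [(5, 8), (10, 14), (3, 5), (8, 12), (6, 9)]

slope, intercept = fit_line([e for e, _ in history], [a for _, a in history])

def machine_estimate(team_estimate_days):
    """Augment, not replace, the team's estimate with the learned historical bias."""
    return slope * team_estimate_days + intercept
```

With this (fabricated) history the team consistently under-estimates, so the machine estimate stretches a 7-day human estimate to a more realistic timeline, matching Condo's observation that predictive timelines are more likely to be met.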

 

SD Times Q&A: Five things to look for in 2023
SD Times, Thu, 22 Dec 2022 — https://sdtimes.com/software-development/sd-times-qa-five-things-to-look-for-in-2023/

This time of year, organizations take stock of the year that’s ending and then strategize about what they want to accomplish in the year ahead.

Forrester research director for software development Chris Gardner spoke with SD Times editor-in-chief David Rubinstein to talk about what they’re seeing for 2023. This is a transcript of that conversation, edited for length and clarity.

SD Times: We’re here to kind of get a sense of what you folks are seeing coming up in 2023 for software development.

Gardner: There are a couple of different things we’re expecting — we’re predicting five things happening. The first prediction is that, since citizen development has taken off in a huge way, we’re getting a tremendous number of questions around low code and no code, particularly from folks that are building applications for the first time. We are seeing a significant number of traditional developers also use low code and no code, and they’re running into their own challenges.

[Non-traditional developers] within organizations are given access to the tools they need in order to do that development. And they are traditionally not given too much governance or too many constraints as to what they can use. What ends up happening is we’re seeing a number of folks that are building applications and aren’t really thinking about the security ramifications of them. They don’t really talk with their security team; they don’t really ask questions around application security or secure coding or data sensitivity. And what we’re seeing is the potential for a breach. So our prediction is that there will be a headline security breach coming out of citizen development in 2023. Most likely, it’ll be sensitive data that gets out that’s not supposed to get out. And to try to battle this, security teams need to set proper guardrails and review roles as part of proper governance. Those pieces will help prevent that breach.

SD Times: Yes, that certainly mirrors what we’ve been reporting. So, what’s the second prediction?

Gardner: The second prediction is, API strategies traditionally have been brought about by IT. And that’s not going to stop. IT is big on building out APIs for connecting things like infrastructure and applications to one another. But what we’re seeing is increased interest in this from business leaders, specifically from folks at the C-level. Over 40% of API strategy is coming from the CEO, as opposed to coming from the CIO — possibly coming from boards of directors saying that they need business agility, they need to be able to create connections between manufacturing systems, retail systems, automotive systems. That ecosystem will be requested by C-level folks who are not necessarily in IT, and we’re expecting 40% of strategies will be led by those folks. So those are the people who are actually going to be working to set up the policies, run APIs, and build out the ecosystems involved with them. They’ll go back to the developers and say, here’s what I’m trying to do and trying to connect, in terms of my workforce or in terms of my manufacturing. But it won’t be a situation where IT comes up with these APIs on its own.

SD Times: There’s been a lot of talk about APIs now becoming the most vulnerable attack area for bad actors.

Gardner: And that’s all the more reason why security needs to be involved in this conversation as well. Whenever we talk about APIs and API taxonomies at Forrester, we always bring the security and risk management folks into the conversation, because it’s critical for them to own that and to shift the process of making sure the APIs are secured as far left as possible.

SD Times: And the third prediction?

Gardner: The third prediction is around the metaverse. The metaverse isn’t here yet, but everyone thinks it’s going to be here eventually. There are a lot of precursors, though, and a Metaverse Standards Forum was started up this year that includes members from a wide variety of companies. However, they’re not necessarily in the business of implementing standards; they’re in the business of letting their member organizations come up with standards that the group itself can adopt. So what we predict is there’s going to be a number of competing API standards for the metaverse next year to connect different worlds, almost akin to how hyperlinks connect you around the web. But there will not be one standard.

SD Times: Interesting. All right. And number four.

Gardner: Number four is value stream management. We’ve found that since we started looking at this space around 2020, value stream management has started to explode. There’s been a number of platforms and players that have come to fruition that are allowing folks to look at the entire software delivery life cycle from beginning to end, find places to remove bottlenecks and improve flow, and identify areas that would be great at contributing to business value. We’re expecting that 20% of enterprises will purchase a VSM solution in 2023. For the enterprises that do adopt it, we expect to see a 50% improvement in release cadence and better alignment in delivering on core business goals.

But it’s one of those things that, up until recently, has been thought of mostly at the strategy level; it’s not really been thought of at the developer level. And we’re seeing more and more folks adopt it as a critical component of the developer life cycle and get the most value out of it.

SD Times: And finally, number five.

Gardner: The final prediction is around WebAssembly. WebAssembly is traditionally thought of as something for web applications, like BBC’s iPlayer, and libraries like TensorFlow use WebAssembly for high speed in the browser. What we’re going to find is WebAssembly moving to the edge in a big way; we’re already starting to see folks leverage WebAssembly to avoid the runtime parsing that bogs down JavaScript at the edge. And at the edge, speed matters. Edge computing requires microseconds, not milliseconds. So we’re expecting that 30% of folks will use WebAssembly at the edge as opposed to web components. And of the compilers that are leveraged for it, Rust is probably the strongest; it’s going to bring non-JavaScript developers to the edge. Rust itself is going to grow about 10%, in part because of its use at the edge. So edge is going to push development of WebAssembly and Rust in a big way in 2023.

Modern app dev is about more than tools, platforms and languages
SD Times, Wed, 06 Jul 2022 — https://sdtimes.com/softwaredev/modern-app-dev-is-about-more-than-tools-platforms-and-languages/

Today’s application development is a complex landscape of services, integrations and architectures. In fact, most developers today spend more time writing API calls and finding open-source projects – and maintaining those applications once they’re created – than they do writing code for innovative new features.

It looks nothing like “your father’s app dev,” which involved a code editor, compiler, and few other tools. In today’s world, we see developers struggling under the weight of an ever-expanding toolbox now required to bring products to life.

According to Andrew Manby, VP of Product Management of HCL Volt MX, among the drivers behind modern development are the needs of business to satisfy customers, and overcoming the effects of the COVID-19 pandemic to be able to continue to deliver fixes and new features at speed.

“We did a survey late last year with Forrester, and in our survey, 78% of respondents said they’re prioritizing improving the ability to innovate and really reach their customers,” Manby said. And for businesses to survive the pandemic, businesses had to rely on that old Yankee spirit and ingenuity, he said. “I think businesses could make do or innovate. Almost like having their own Apollo 13 moment, to fix the problem, to be able to continue to reach the customer, adding buy online, pick up in store, things like that. It was that sort of duct tape and air filter moment, for a lot of organizations.”

Piecing together tools for collaboration, development and deployment to a remote workforce has been made a lot easier with cloud computing – no more creating VPNs, unless organizations have specific regulations or security needs they must follow. However, the cloud doesn’t really help address issues such as culture change and the move towards delivering products instead of projects.

Agile and culture change

Agile development is one of those areas where scaling up has been a thorny issue for many organizations. Agile, according to Forrester vice president and analyst Diego Lo Giudice, is not “just a bunch of practices.” Some think going to Scrum training and bringing what you know back to the organization will have everyone working in an Agile way. But Lo Giudice said shifts to Agile and other methodologies require a cultural and behavioral change. “Think about your IT that has been owning the projects, and now suddenly they say we’re going to move to products and you’re going to have a product owner from the business side. And he or she is going to tell you what are the most important things you need to implement. It’s kind of losing power for project managers that used to manage these … projects.”

Another issue Lo Giudice pointed out is integrating all of it throughout the organization. “Everybody thinks SAFe is saving the world. I get lots of clients who tell me, ‘we’re replacing the old bureaucracy with a new type of bureaucracy here.’ Cultural and behavioral change is really tough for organizations.”

Further, he said, these product owners from the business don’t have the skills to think in terms of how project managers in modern development think about minimum viable features and minimum viable products. “They still think in terms of big releases,” he said. Also, he added, business-side product owners “are not even committed to Agile. It’s like, ‘We want to do Agile, but you do it, I’m not going to get involved.’ But that’s not the way agile works.”

But because of this drive to modern application development, organizations are starting to think seriously about what agility, responsiveness and velocity really mean to them. “It comes down to the business problem,” HCL’s Manby said. “I think CIOs are still faced with the same thing – at the end of the day, they still need to modernize their application inventory, they need to move to the cloud because they want to obfuscate some of the risks that they have in their data center. And they want to move that off to other vendors, they want to make the portfolio of applications more modern.”

Another aspect of modern development to think about has nothing to do with tools or programming languages. It’s the difficulty organizations are having in attracting and retaining developer talent. “People, given this day and age, are more mobile – not in the physical sense, but more willing to swap” one job for another, Manby said. “Developers want to do meaningful work, they want to be in an engaging work environment, and they want to use the cool tools. But they also want to use the stuff they learned in college, or in their experience. But there’s the old guard who know how to do things in a certain way. They’re used to using WebSphere and db2 and Oracle, and Siebel. And the new generation is coming in, and they’re all React and Angular and all container ready and Git friendly. It’s not the culture clash, but the organizations that haven’t shifted are finding it more difficult to get to containers and the cloud. The smarter organizations are bringing in more of the influx of those newer developers and the new-wave IT people to help push that acceleration along, to use those new types of tools.”

A place for low-code tools

With different languages and platforms for creating or importing pieces of code to create modern applications, Manby said “we’ve probably got as much fragmentation now from an application developer standpoint as we’ve ever had.” He went on to say that the rate of change has gotten faster as well. “Angular 1, Angular 2, React, Flutter. It’s almost like there’s a faster inertia,” he said. “And there’s a concern about obsolescence. If you have to look after a piece of code that’s got Dojo in it, when you give that to a new developer, they say, what’s this stuff? That’s a challenge. But at the same time, in its day Dojo was modern and exciting for folks.”

This, Manby believes, is where low code is trying to come from. “The appeal of the platform is, whatever framework you may be using, if we as a vendor do this the right way, then whether it’s Angular or React or whatever, we’re going to insulate you from those sorts of challenges,” he said. “But we’re still going to give you something that’s not going to dumb down the skills that you’ve learned but also allows you to be a superhero, and do some cool stuff without boxing you in.” 

Low code has become a modern de rigueur term, and represents a way to apply rigor to development and deployment, Manby said. “Low code is applied to DevOps pipelines, it’s applied to data integration. You could apply the principles of anything, which gives you a visual model, a model-driven approach. You can say that no code or low code makes [development] go faster, when it comes back down to pure developer productivity.”

When it comes to professional development, low code is not removing tools, Manby said. “It’s providing pieces to try and make those developers’ lives simple. If you can simplify how you aggregate data across multiple systems, or provide you with an orchestration layer so you can orchestrate a series, a more complex workflow with parallel looping. Do you want your developers to create that from scratch, and then have to maintain it? Or do you want to use a tool to enable you to do that?”
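Manby’s data-aggregation example can be sketched in plain code to show the kind of plumbing a low-code orchestration layer abstracts away: fan out to several backend systems in parallel, then merge the results. The systems and function names below are hypothetical stand-ins, not any particular product’s API.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-system fetchers -- stand-ins for real service calls.
def fetch_orders(customer_id):
    return {"orders": [101, 102]}

def fetch_invoices(customer_id):
    return {"invoices": [9001]}

def fetch_tickets(customer_id):
    return {"tickets": []}

def aggregate_customer_view(customer_id):
    """Fan out to several systems in parallel, then merge the results."""
    fetchers = [fetch_orders, fetch_invoices, fetch_tickets]
    merged = {}
    with ThreadPoolExecutor(max_workers=len(fetchers)) as pool:
        # pool.map preserves order, so merging is deterministic.
        for result in pool.map(lambda f: f(customer_id), fetchers):
            merged.update(result)
    return merged
```

A low-code platform would let a developer wire up the same fan-out/fan-in visually; hand-rolling it also means hand-maintaining the error handling, retries, and monitoring that are omitted here.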

As for testing, Manby said a low-code tool can generate the test case automatically and continually test the applications as they evolve, which saves developers time. “It’s not about removing things,” he said. “It’s just trying to make you more productive.”

MAD about development

The baseline activities of modern application development, as defined by research firm Forrester, are ideate, design, build and deliver. According to an August 2021 report on MAD, Forrester said organizations augment these activities with value stream management, collaborative work management, low code and continuous testing.

The design phase includes developing a prototype, then a minimum viable product. In its report, Forrester notes that experimentation can begin in this phase, using feature management (such as flags) to let developers turn those features on or off as the product makes its way toward full release.
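Feature management of the kind Forrester describes can be as simple as a runtime flag check: the same build ships both code paths, and the flag decides which one runs. This is a minimal illustrative sketch with made-up flag names; real feature-flag products add targeting rules, percentage rollouts, and remote configuration.

```python
# Minimal in-process flag store (illustrative; flag names are invented).
FLAGS = {"new-checkout": True, "beta-search": False}

def is_enabled(flag_name, default=False):
    """Return whether a flag is on; unknown flags fall back to the default."""
    return FLAGS.get(flag_name, default)

def render_checkout():
    # Both paths ship in the same release; the flag decides at runtime,
    # so the feature can be turned off without a redeploy.
    if is_enabled("new-checkout"):
        return "new checkout flow"
    return "legacy checkout flow"
```

Flipping `FLAGS["new-checkout"]` to `False` reverts users to the legacy path instantly, which is what makes flags useful for experimentation before full release.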

But at the core of all this is business value, and Forrester’s MAD model says that everything developers create must “ultimately be in service of value streams.” Value streams, and the management of those streams, are how organizations elevate their Agile and DevOps practices, by gaining insights into the processes used to create and deliver quality software that customers want. Determining what the business wants, and why, should be the first step in the process of creating software products.

Collaborative work management “supports the confluence of project and process work by allowing users to create personal and team workspaces,” according to the Forrester report, while low code expands development outside of IT.

Meanwhile, continuous testing is required to ensure the accelerated pace of software creation and delivery does not impact the quality of the product.

The post Modern app dev is about more than tools, platforms and languages appeared first on SD Times.

Report: Fully automated testing remains elusive for organizations
https://sdtimes.com/ai-testing/report-fully-automated-testing-remains-elusive-for-organizations/
Thu, 30 Jun 2022

Despite the growing complexity of the software that drives organizations, few companies have fully automated testing or are using AI, according to new research conducted by Forrester and commissioned by Keysight. 

For the study, Forrester conducted an online survey in December 2021 that involved 406 test operations decision-makers at organizations in North America, EMEA, and APAC to evaluate current testing capabilities for electronic design and development and to hear their thoughts on investing in automation. It found that only 11% of respondents have fully automated testing. Eighty-four percent of respondents said that the majority of testing involves complex environments. 

Most companies reported that they’re moderately or very satisfied with their testing methods, and three-fourths of them use a combination of automated and manual testing. However, 45% of companies say that they’re willing to move to a fully automated testing environment within the next three years to increase productivity, gain the ability to simulate product function and performance, and shorten the time to market. 

Companies are also looking to add AI for integrating complex test suites, an area of test automation that is severely lacking, with only 16% of companies using it today. 

“Despite their reported high satisfaction levels with their testing methods, companies are interested in moving to more automated approaches and using AI for integrating complex test suites. They understand this will increase their productivity, simulate product function or performance, and shorten design cycles, thereby reducing product time to market,” the research stated. “In turn, this improvement in the testing and development process will yield higher customer satisfaction and increase product sales or revenue. They recognize that reducing time to market can be achieved by better analytics on current test and measurement data, integrated software tools across the product development lifecycle, and an improved ability to share data across teams.”

RPA: Handling mundane tasks, freeing up developers
https://sdtimes.com/softwaredev/rpa-handling-mundane-tasks-freeing-up-developers/
Mon, 07 Feb 2022

Robotic Process Automation (RPA) has been a useful tool for many organizations. Despite the initial fear that it would grow to take over the jobs of developers, many have come to see that RPA and automation only function well when they work in tandem with developers. 

According to Yishai Beeri, growth technologies lead at LinearB, the best way for organizations to utilize RPA is to implement it with the purpose of eliminating the mundane tasks that would usually fall to developers. 

He also explained how this technology works to ensure consistency across a development team. “Developers have their own skills, they can automate basically whatever they like if they put their time into it, but sometimes, you want a more organized or central solution for automating these things instead of every developer just scripting away,” Beeri said. “Maybe it’s not important enough for a single developer but if you look at 100 developers…the small time wasters are things that you can automate away with a more centralized solution.” 

Carlos Melendez, COO and co-founder of Wovenware, echoed Beeri’s sentiments by explaining that organizations would much rather have their developers working on tasks that bring value to the company rather than spending the majority of their time on duties that could easily be automated with something like RPA. 

Melendez also explained that when implementing RPA, this is the message that can fight off the employee resistance that may come from the fear of losing their jobs to automation technology. “A lot of the time it’s not about replacing employees, it’s about augmenting their capabilities. So, if half of your time is spent kind of preparing a file or preparing an integration or moving data from one point to another or doing data entry, then you want that person to spend more time on their analysis and verifying what is happening instead of the actual data entry part,” he said. 

RPA still new, and evolving

Jon Knisley, principal consultant, automation, and process excellence at FortressIQ, said that RPA and other automation technologies are still relatively new and, therefore, rapidly evolving. He said, “Among companies that have deployed RPA, a majority have less than 10 bots in production and just 10% have launched more than 100 bots, according to a recent report from Automation Anywhere.” With this, he added that he believes that the full breadth of what RPA and automation in general can do is still undiscovered. 

“Only 11% of business executives surveyed by McKinsey believe their current business models will be economically viable through 2023. Given the potential disruption, organizations continue to invest in complex change programs despite dismal success rates of less than 30%. Automation is the new transformation,” Knisley said. He also noted that RPA has been the fastest growing segment for the enterprise software market for three consecutive years beginning in 2019. “Grand View Research estimates the global market for RPA will surpass $2 billion in 2022 and continue to grow annually at 40%,” he said. 

Arthur Villa, an analyst at Gartner, said that his company’s research has yielded the same results, saying that he has seen no evidence that RPA has been slowing down, even in the midst of newer technology. “[As far as] the state of RPA implementation, I would say that it is still in the relatively early days. If we look at it as a four quarter game we’re probably only in the second quarter… I think that there’s still a lot of adaptations that have been made in the last couple of years,” he said.

With this, Villa pointed out that RPA has only been growing in popularity as larger and more well-known organizations introduce this technology. “A lot of these new vendors are coming into the RPA market and shaking things up. There’s a lot happening within the market especially from the customer and buyer perspective… Many companies start small with RPA and then they rapidly expand those programs so I think we’re still early on in new customers buying RPA and beginning to experiment with the technology,” he said. 

Villa believes that the reason RPA has been so widely accepted and implemented is because it offers organizations simplicity and overall convenience. According to Villa, when compared to other AI solutions, RPA is lower in cost, easy to understand, and companies will usually see a quick return on investment. 

Data, change management challenges

In spite of this, though, it is not uncommon for organizations to face some challenges when introducing RPA into their business processes. According to Melendez, these challenges often fall into two categories: data-driven and change management. Melendez explained that the data aspect of these challenges has to do with the quality of the data itself as well as overcoming the different types of roadblocks that arise when you try to automate using bad data. The change management-centric challenges have more to do with employees being worried about what RPA is going to do and how it will change their own jobs. 

When working to remediate these challenges, Melendez said, “Technology is moving so fast that you really need a good set of technology partners that you can trust, that you can go to when you need certain technology solutions. You have your AI partner, your RPA partner, and other partners that will help you navigate the complexities and the changes in those technologies.”

From RPA to IPA (not the pale ale)

The conversation around RPA has shifted slightly in recent years to cast a wider net; the newer terminology is Intelligent Process Automation (IPA). The low-code automation company Nintex has been championing it, and according to Terry Simpson, senior solutions engineer at Nintex, “IPA is like the grownup and more mature sibling of RPA. When we say sibling, think about IPA being about 20 years older than its younger sibling RPA, on the maturity scale.”

Simpson continued, “IPA is actually the combination of several technologies coming together to create a very mature and flexible automation capability. Intelligent workflows, natural language processing, machine learning, and even RPA are all integral parts of an IPA solution.” He explained that a key difference between RPA and IPA is that while RPA usually runs on a local machine, IPA is a cloud-based virtual environment. “In simple terms, think about IPA sitting right in the middle of all your applications and performing process automation focused on an entire solution, not just tasks. Tasks may make up a piece of the solution, but IPA brings the entire solution or process together,” he said. 

Brett Greenstein, data and analytics partner at PwC, said, “RPA is getting less discussion… because automation has expanded well beyond screen scraping and bots, through the use of APIs, Microservices, and AI/ML. Many companies have adapted to this by expanding the term to IPA to include those newer capabilities as well as process mining.”

Greenstein explained that in the current environment the need for automation is only growing. In the midst of the great resignation and a shortage of skilled developers, automating tasks using a smarter solution is quickly becoming a necessity rather than a luxury. This increased demand for automation has led to the expanding of RPA into IPA in order to introduce fresher technologies into an already reliable method of automation. 

RPA as communication tool

Beeri thinks there is a new role that RPA can fill in the face of a more distributed workforce. He said that RPA can be used as a communication tool that can remind developers of when it is time to take the next step. “A lot of the work that software developers do as a team is a lot of back and forth and communication between people… so coordinating that in an environment where it is mostly asynchronous and we’re not in the same room anymore… automation and smart bots can really help in coordinating this ‘dance’ between people so that people are not interrupted,” he said.

Beeri said that even though this is not a task that has been done in the past or a role that used to be filled by a separate employee, it has become important with the trend we’re seeing towards working remotely. He said, “It really helps to minimize interruptions and maximize speed when working together on things.”

According to Villa, only a few organizations are currently experiencing the full benefits of RPA that Beeri is referring to. He believes that the majority of companies utilizing this technology are the ones that are generating high volumes of revenue, meaning that small and mid-market organizations have yet to adopt automation technology. He said, “There’s still a lot of education that has to happen within mid-market companies that need to understand ‘what is RPA? How can it be used? And how can I get the most bang for my buck?’”  

Knisley also pointed out that education around RPA and automation is essential when trying to implement it in the most effective way possible. However, he also placed an emphasis on the importance of fully understanding and optimizing the company itself before introducing automation. He said, “To achieve the magical future state promised by technology, companies first need to understand their current state. Unfortunately, most companies do not understand how they truly operate, especially at a granular user activity level. To be successful and avoid false starts, companies need to discover, re-engineer and automate — in that order.” 

RPA has fallen out of the spotlight over the past few years

Even with RPA’s rapid growth, it is noteworthy that it has fallen out of the spotlight somewhat in recent years. According to Greenstein, this is the result of newer technologies being introduced. He said, “First, there is the screen scraping and click automation that allows RPA to execute the same steps a person would execute in any application. Second, there is the scripting for bots that identifies a sequence of actions with basic logic to decide what action to execute next. As APIs and microservices become more and more available, especially as applications modernize on the cloud, the need for screen scraping and click automation goes away.”

Along the same lines, Beeri said that he feels RPA is not as widely talked about because, at its inception, it was overhyped and now it is failing to live up to all of the original promises. “I think when you start to look at how to deploy [RPA], and what tasks need to be removed, you’re finding that you can change the actual task, you don’t stop at just putting a robot in to automate data entry… The solution for the problem at that point might not be RPA, it might just be automating something using no-code or low-code methods,” he said. 

However, Melendez credits this lack of discussion to something different. He said that rather than RPA being at the center of the discussion, people have shifted to speaking more generally about automation. “RPA as well as AI is becoming so prevalent that the conversation is no longer about deploying RPA, it’s about the solution that we are going to deploy [using RPA],” he said. Melendez explained that because these tools and technologies are so advanced, not only is it assumed that they will be in place, but that it is also assumed that, in most cases, they are going to be able to easily automate whatever is necessary. 

Automation and humans need to work together

On an SD Times-led discussion of RPA on the Discord “Dev Interrupted” server, participants had a lot to say. One of the respondents, Dr. Don Wilcox, talked about a robot used at his organization, which they named Marvin. “We have automation that completes a Task (the most-specific sort of ticket) when a PR associated with that Task is completed,” Wilcox explained. “Then Marvin takes over and changes the state of the parent story based on whether the dev tasks, qa tasks, demo tasks, etc, are complete. Marvin has his own row on the board, and you can get him to perform certain automation tasks (such as adding the standard stories and tasks to a new sprint) just by giving him an appropriately named task in that sprint.” 

This is a good example of the way that automation and human employees have to work together. There’s no doubt that the addition of the robot makes things run more smoothly but the robot cannot function without the direction of the human, which was the overall consensus from the discussion. The idea that RPA or another form of automation will be the end of human labor is far from the truth.

In further support of this, Wilcox said, “Once you build a robot, someone needs to maintain, enhance, QA, replace. That robot assumes its own product lifecycle, which will likely require humans. For the foreseeable future, it is going to be humans building the robots, even if the robots help.” 

Report: Over half of developers feel that current security policies stifle innovation
https://sdtimes.com/security/report-over-half-of-developers-feel-that-current-security-policies-stifle-innovation/
Thu, 23 Sep 2021

Just over half of developers feel that security policies stifle their innovation and only about a third of developers reported that they are thoroughly educated on the security procedures they are expected to execute, according to a new report by VMware and Forrester. 

Forrester conducted a VMware-commissioned survey called “Bridging the Developer and Security Divide” with 1,475 respondents and five interviews with IT, security, and development managers and above (including CIOs and CISOs) with responsibility for development or security strategy decision-making to explore this topic. 

The survey respondents noted that the top two most challenging tasks are ensuring security in the cloud at 79% and securing workloads and containers at 71%.

“Organizations expect developers to be more involved with security tasks in the future, particularly among cloud and workload tasks. However, developers currently aren’t very involved in security strategy planning or execution,” the report stated. It is important for security and development teams to work together so that development teams are clear on which policies to comply with and which tools are approved.

The best way around these bottlenecks is to make sure security is no longer a specialization at an organization, and that security tasks are embedded across people, teams, processes, and technologies, as in DevSecOps, according to Forrester.

Other methods to improve security include sharing KPIs with developers, automating security to improve scalability, and having the security side learn to speak the language of the development team rather than the other way around. 

“Having a security advocate who asks the right questions and takes the time to get to know the development teams will go a long way to building trust between teams,” the report added. 

The report found that there is room for growth in security education programs, since just over half of developers (54.3%) said there is a formal education process for new and updated security policies within their organizations. 

The biggest challenge will be changing the culture among developer and security teams to foster more collaboration and drive shared outcomes, and while some organizations have already made improvements, there is still a long way to go, according to Rick McElroy, principal cybersecurity strategist at VMware. 

“Developers are more aware of threats and security risks, but they are challenged by the time to market on many efforts. Tradeoffs happen during the development cycle and, up until very recently, one of the major ones has been security. This has changed given board-level involvement in cybersecurity — we are seeing organizations shifting left on security and delivering solutions that have security built in from development to maintain applications in production over time,” McElroy said.

Forrester: 5 key advances driving AI 2.0
https://sdtimes.com/ai/forrester-5-key-advances-driving-ai-2-0/
Thu, 11 Feb 2021

The move to AI 2.0 is being driven by five areas of AI advancement, according to a new report from analyst firm Forrester.

“Though you’ve likely never heard of them, these AI 2.0 advances are already entering commercial products, and forward-looking enterprises need to start preparing if they want to reap their competitive advantages,” the report stated.

The first advancement is transformer networks, which are multitasking deep learning models that can be used in problems that have a time or context dimension, such as understanding and generating text or code. They are currently in use by hyperscalers AWS, Google, IBM, and Microsoft, and by speech and text analytics companies.
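At the heart of a transformer network is scaled dot-product attention, which mixes value vectors according to how similar each query is to each key. The NumPy sketch below shows just that core operation, not a full multi-head, multi-layer model, and the input shapes are invented for illustration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weight the value vectors V by the similarity of queries to keys."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (n_q, n_k) similarities
    # Numerically stable softmax over the keys.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                                # each output mixes the values

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))   # 4 query positions, model dimension 8
K = rng.standard_normal((6, 8))   # 6 key/value positions
V = rng.standard_normal((6, 8))
out = scaled_dot_product_attention(Q, K, V)
```

Because every output position can attend to every key position, the model handles the "time or context dimension" the report mentions without recurrence.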

The second is synthetic data that is created in simulated virtual environments. This simulated data can be used to create or augment existing training data. Forrester believes synthetic data can be used to accelerate the development of new AI solutions, improve the accuracy of AI models, and protect sensitive data. It is currently being used in autonomous vehicles, financial services, insurance and pharmaceutical firms, and computer vision vendors.
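The idea of synthetic training data can be illustrated in miniature: fit a simple distribution to a real sample, then draw as many new rows as needed from it. This is only a sketch with made-up columns; production tools model correlations, categorical fields, and privacy constraints far more carefully.

```python
import numpy as np

def synthesize(real_data, n_rows, seed=0):
    """Draw synthetic rows from a Gaussian fitted to the real sample.

    Illustrative only: a single multivariate normal is the simplest
    possible generative model of tabular data.
    """
    rng = np.random.default_rng(seed)
    mean = real_data.mean(axis=0)
    cov = np.cov(real_data, rowvar=False)   # captures pairwise correlations
    return rng.multivariate_normal(mean, cov, size=n_rows)

# Four made-up records: columns are (age, income).
real = np.array([[30.0, 52000.0],
                 [41.0, 61000.0],
                 [25.0, 43000.0],
                 [37.0, 58000.0]])
synthetic = synthesize(real, n_rows=1000)
```

The synthetic rows preserve the sample’s means and correlations while containing none of the original records, which is why the technique is attractive for augmenting scarce or sensitive training data.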

Reinforcement learning is another advancing area of AI that is used to create models that optimize many objectives or constraints, or decide an action based on feedback. It is currently being used by firms dealing with marketing tasks, manufacturing tasks, and robotic learning. 
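The feedback-driven decision-making described above is easiest to see in the simplest reinforcement-learning setting, a multi-armed bandit with an epsilon-greedy policy: mostly exploit the action with the best observed reward, occasionally explore. The reward rates below are invented for the toy example.

```python
import random

def run_bandit(true_rates, steps=5000, epsilon=0.1, seed=42):
    """Epsilon-greedy bandit: explore with probability epsilon, otherwise
    exploit the arm with the best running-average reward."""
    rng = random.Random(seed)
    n = len(true_rates)
    counts = [0] * n
    values = [0.0] * n                 # running average reward per arm
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n)                              # explore
        else:
            arm = max(range(n), key=lambda a: values[a])        # exploit
        reward = 1.0 if rng.random() < true_rates[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]     # incremental mean
    return values

estimates = run_bandit([0.2, 0.5, 0.8])
```

After enough feedback the policy concentrates on the best arm, which is the same optimize-under-feedback loop that marketing, manufacturing, and robotics applications scale up.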

The fourth area, highlighted by Forrester in the report, is federated learning, which is a process for combining models that are trained on separate data sets. It can be used to share information between devices, systems, and companies to overcome privacy, bandwidth, and computational limits, Forrester explained. Currently, federated learning is being leveraged by hyperscalers, AI application vendors, and consumer electronics companies.
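The best-known federated learning recipe, federated averaging, combines locally trained models by weighting each client’s parameters by the size of its local dataset, so the raw data never leaves the client. A toy sketch with invented weight vectors:

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Combine per-client model parameters without sharing raw data:
    each client's weights count in proportion to its dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Three clients trained the same (hypothetical) linear model locally.
w_a = np.array([1.0, 2.0])
w_b = np.array([3.0, 4.0])
w_c = np.array([5.0, 6.0])
global_w = federated_average([w_a, w_b, w_c], client_sizes=[100, 100, 200])
```

Only the parameter vectors cross the network; the server never sees a single training example, which is how the approach sidesteps the privacy and bandwidth limits the report mentions.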

The final new emerging AI area is causal inference, which helps determine cause-and-effect relationships. It can be used to obtain business insights and prevent bias by providing explainability, which Forrester notes can be just as important as prediction accuracy. It is currently being used by innovation teams for use cases such as determining how effective a treatment for a particular disease is. 
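One standard causal-inference move is to adjust for a confounder by stratifying: compare treated and untreated outcomes within each stratum, then average the per-stratum effects. The sketch below uses invented records and field names purely to illustrate the idea.

```python
def adjusted_effect(records, strata_key="segment"):
    """Estimate a treatment effect by comparing treated vs. untreated
    outcomes within each stratum, then size-weighting across strata."""
    strata = {}
    for r in records:
        strata.setdefault(r[strata_key], []).append(r)
    effects, weights = [], []
    for group in strata.values():
        treated = [r["outcome"] for r in group if r["treated"]]
        control = [r["outcome"] for r in group if not r["treated"]]
        if treated and control:   # skip strata with no comparison possible
            effects.append(sum(treated) / len(treated) - sum(control) / len(control))
            weights.append(len(group))
    return sum(e * w for e, w in zip(effects, weights)) / sum(weights)

# Invented data: "segment" is the confounder being adjusted for.
data = [
    {"segment": "smb", "treated": True,  "outcome": 10},
    {"segment": "smb", "treated": True,  "outcome": 12},
    {"segment": "smb", "treated": False, "outcome": 8},
    {"segment": "ent", "treated": True,  "outcome": 20},
    {"segment": "ent", "treated": False, "outcome": 17},
    {"segment": "ent", "treated": False, "outcome": 17},
]
effect = adjusted_effect(data)
```

A naive treated-vs-control comparison across all records would mix up the segment effect with the treatment effect; stratifying first is the kind of explainable adjustment the report credits causal inference with.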

“The opportunity to get in on the ground floor of a transformative set of technologies doesn’t come along often. When one does, it is usually inaccessible to all but a select group of specialists. For now, AI 2.0 has leveled the playing field by eliminating many barriers to entry built on years of expertise in AI domains like natural language processing, computer vision, and data advantages painstakingly built over years. Newcomers are outperforming veterans, and startups are building new applications that used to take years or were infeasible. Could you wait and take advantage of AI 2.0 solutions once they are mature? Yes, but you would forgo the opportunity to outperform your industry,” Forrester wrote in the report. 

Report: New approaches to software development will disrupt the status quo
https://sdtimes.com/softwaredev/report-new-approaches-to-software-development-will-disrupt-the-status-quo/
Thu, 08 Nov 2018

The year 2019 will bring new approaches to increase software development productivity and better align development teams and organizations, according to a recent report by research firm Forrester. Among the new approaches are cloud native, value stream management and artificial intelligence-based tools.

“New platforms for cloud-native app architectures, value stream management tools, and infusing artificial intelligence (AI) into testing are among the breakout technologies we expect in 2019,” the research firm wrote in a 2019 software predictions report.

2018 saw the start and increase in interest in cloud-native technologies and tools. The Cloud Native Computing Foundation released a report in September that found the use of cloud-native technologies in production grew more than 200 percent since the beginning of the year. The CNCF states, “Cloud native technologies empower organizations to build and run scalable applications in modern, dynamic environments such as public, private, and hybrid clouds. Containers, service meshes, microservices, immutable infrastructure, and declarative APIs exemplify this approach.” In addition, the CNCF explains cloud-native systems must be container packaged, dynamically managed and microservices-oriented.

2019 will be the breakout year for cloud-native technologies as cloud vendors start to bring microservices to the masses and blur the lines between container-based and serverless approaches, according to Forrester.

“From a software development perspective, the cloud offers simplicity, velocity, elasticity, collaboration, and rapid innovation that isn’t easily replicated, if at all possible, using traditional on-premises tools. And if you are going to be hosting on the cloud, why not develop in the cloud to make sure your development environment is as close to your run time environment as possible?” said Christopher Condo, senior analyst for Forrester. “Everyone is coming to grips with the fact that some portion of their business will be hosted in the cloud, so they then make the logical leap to say if we’re hosting this particular feature or product in the cloud, let’s go ahead and develop in the cloud as well.”

Value stream management is another term that started gaining traction towards the end of 2018. It refers to an effort to align business goals and visualize the end-to-end development pipeline. According to Forrester, value stream management tools are designed to capture, visualize and analyze critical information about speed and quality during the software product creation process. The research firm predicts value stream management will become the new dashboard for DevOps teams in 2019.

“When we started researching these tools in 2017/2018, we weren’t even sure if the term Value Stream Management would resonate. But it has, because the development community has been doing their homework on Lean software development and realizes that in order to improve, you need to measure end-to-end performance over time,” said Condo. “They also recognize that delivering value is why they build software to begin with, so these tools are a natural complement to a DevOps tool chain. Once these tools grow in use, we’ll need to be on the lookout for practitioners being too focused on KPIs and not enough on delivering value to the customer.”

As AI continues to advance, Forrester also sees new AI-based tools emerging in 2019 to provide better insights into development and testing. “Digital acceleration won’t happen without higher-quality software. Despite improved automation with traditional UI and API testing, the field as a whole is still lagging,” Forrester wrote in its report. “Can AI help augment testers and automate testing? You bet; a whopping 37% of developers are already using AI and machine learning (ML) to test better and faster to increase quality.”

DevOps initiatives will still struggle in 2019
Despite Forrester declaring DevOps the new norm, the approach will still face its set of challenges throughout the new year.

“DevOps remains a very active topic of customer inquiry. The questions have shifted, but not declined. Five years ago people were asking ‘what is it,’ and three years ago they were asking ‘how can I get started?’ Now they are asking ‘how do we scale it out? What do we do with ITIL? Can and should we shift entirely to integrated product teams? What about governance, risk, and compliance?’” said Charles Betz, principal analyst at Forrester. “It’s a period of great experimentation and learning, which we anticipate will continue for some years more before best practices start to stabilize.”

Forrester predicts 2019 will be the year of DevOps governance and technology rationalization.

“Issues of risk and governance, including security, change management, and quality control, will take center stage. Impacted by these challenges and increased DevOps vendor competition, enterprises will also emphasize toolchain standardization and more-seamless automation,” the research firm wrote in its 2019 DevOps predictions report provided to SD Times.

As a result of the need for governance, Forrester sees fewer businesses cobbling together their own DevOps toolchains and instead turning to more integrated solutions for improved consistency and compliance. “This will force players from multiple spaces to compete,” Forrester wrote.

In addition, Forrester expects DevOps will experience a major security breach in 2019. “Continuous delivery toolchains are powerful — perhaps too powerful. If an attacker gains access to the toolchain, the entire infrastructure, both upstream and downstream, is at risk,” the firm wrote.

According to Forrester, a major security breach to the DevOps toolchain will result in businesses investing more in governance and risk analytics, as well as the adoption of privileged identity management. “It will also prompt more ‘policy as code’ and secure integration between discrete parts of the development and continuous delivery toolchain,” Forrester wrote.
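In practice, “policy as code” means expressing governance rules in a machine-readable form so a CI stage can evaluate them against a proposed deployment before the toolchain is allowed to proceed. The sketch below is purely illustrative; the rule names and the shape of the deployment record are assumptions for the example, not any specific product’s format.

```python
# Each policy pairs a human-readable name with a check over a deployment record.
POLICIES = [
    ("artifacts must be signed", lambda d: d.get("artifact_signed") is True),
    ("no unreviewed pushes to production", lambda d: d.get("reviewed") is True),
    ("secrets must come from a vault", lambda d: not d.get("inline_secrets")),
]

def evaluate(deployment: dict) -> list:
    """Return the names of violated policies (an empty list means allowed)."""
    return [name for name, check in POLICIES if not check(deployment)]

deployment = {"artifact_signed": True, "reviewed": False, "inline_secrets": False}
print(evaluate(deployment))  # ['no unreviewed pushes to production']
```

A real deployment would typically use a dedicated policy engine rather than inline lambdas, but the principle is the same: the gate is versioned, reviewable code instead of a manual checklist.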

Other DevOps predictions for the next year include: 25 percent of openings for skilled DevOps engineers will go unfilled, mean time-to-resolution rates will increase, and businesses will start to revolt against legacy operational processes.

“In 2019, firms will face increasing complexity and risk as they attempt to scale their DevOps initiatives. This will force them to create more holistic technical, organizational, and governance strategies to address serious capability gaps and better manage risk,” Forrester wrote.

The post Report: New approaches to software development will disrupt the status quo appeared first on SD Times.
Report: Developers slow to adopt low-code and no-code solutions https://sdtimes.com/lowcode/report-developers-slow-to-adopt-low-code-and-no-code-solutions/ Fri, 26 Oct 2018 13:18:35 +0000

Tools for rapid application development have been around for decades, yet despite the hype and benefits surrounding low-code and no-code development, developers are slow to use them, according to a new report.

The Evans Data Global Development Survey is conducted every six months and is based on the responses of more than 1,400 developers worldwide. It found that only one in five developers never uses low-code or no-code platforms, and that 73 percent of respondents who do use them do so less than half the time. Additionally, 7 percent use the solutions more than 75 percent of the time, and only 2 percent use a low-code/no-code platform exclusively.

“Low-code is another name for the types of declarative visual tools that originated 25 years ago,” Janel Garvin, CEO of Evans Data, wrote in the report. “And while it’s not actually a new concept, it has now re-appeared again in a new re-incarnation, so to speak. This time the dream of citizen developers arising to replace traditional developers has also returned. If the past is indicative of the future then it’s unlikely to succeed in creating a significant number of untrained amateur developers, and in the meantime the actual developers are not embracing the concept or the tools.”

Other industry reports, however, have stated that low-code development platforms are here to stay. In one, the Forrester Wave: Low-Code Development Platforms for AD&D Pros, Q4 2017, the research organization found that application development and delivery (AD&D) pros are moving to low-code platforms to speed up application delivery and innovation, develop large-scale applications, and move to the public cloud. The report was based on 41 interviews with AD&D leaders.

When the report looked at the challenges organizations face, it found difficulty meeting business requirements, lack of flexibility, slow application updates and high costs among the top problems. When asked how low-code development platforms have addressed these issues, a majority of respondents reported notable to significant improvements.

In order to get developers on board with low-code development platforms, Forrester explained, the platforms have to cater to professional developers with controls and deep features, and support developer-business collaboration. “Developers adopting low-code platforms want dramatically higher productivity without sacrificing features that allow them to get under the hood if the application they’re building calls for it,” Forrester reported.

Gartner, which refers to low code as high-productivity application platform as a service (hpaPaaS), found in its Magic Quadrant for Enterprise hpaPaaS that the tools are increasingly expanding their footprint across enterprise IT. “These hpaPaaS solutions enable the enterprise to utilize a full range of developer personas — citizen developers, departmental developers and enterprise IT professionals — and to develop applications that range from tactical to strategic, and stand-alone to integrated,” Gartner wrote. “Such applications will typically be data-oriented, although wider enterprise features such as IoT and event-driven support are becoming more common.”

The post Report: Developers slow to adopt low-code and no-code solutions appeared first on SD Times.
