CX Archives - SD Times (https://sdtimes.com/tag/cx/)

UI testing a key part of delivering good user experiences
https://sdtimes.com/test/ui-testing-a-key-part-of-delivering-good-user-experiences/
Thu, 04 Feb 2021 16:39:05 +0000

User experience has always been an important factor in the success of an application, but in an increasingly digital-only world, its importance is only increasing. 

If all goes perfectly, the user doesn’t think about what’s going on behind the scenes. When a button does what it’s supposed to when it’s clicked on, the user goes about their day, possibly even completing a transaction on the site. But when that button takes several seconds to do anything — or, worst-case scenario, never actually does anything — the user will be frustrated, and perhaps jump to a competing site. 

According to Eran Bachar, senior product management lead at software company Micro Focus, UI testing is the part of the process that tests for usability, which encompasses making sure that customers who are going to use the application understand the different flows. Another element of UI testing, especially in the web and mobile space, is testing user experience. “If you’re waiting more than two seconds for a specific operation to be completed, that can be a problem, definitely. The moment when you click on a button, when you click on an icon, when you click on any element, you should have a very, very fast responsive kind of an action,” Bachar said.
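Teams can make that responsiveness budget concrete in their automated checks. A minimal sketch in Python, where the two-second figure follows Bachar's comment and the timed operation is a stand-in for a real UI action:

```python
import time

RESPONSIVENESS_BUDGET_SECONDS = 2.0  # the threshold Bachar cites as a problem point

def within_budget(operation, budget=RESPONSIVENESS_BUDGET_SECONDS):
    """Time an operation and report whether it finished inside the budget."""
    start = time.perf_counter()
    operation()
    elapsed = time.perf_counter() - start
    return elapsed <= budget, elapsed

# Stand-in for a real UI action, e.g. the handler behind a button click.
ok, elapsed = within_budget(lambda: time.sleep(0.01))
print(ok)  # True: the simulated action finishes well under two seconds
```

In a real suite the lambda would be replaced by a driver call (a click plus a wait for the resulting element), but the pass/fail logic stays this simple.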

Clinton Sprauve, director of product marketing at testing company Tricentis added: “The goal of UI testing is to understand all aspects from a UI perspective on the business process side, but also on the technical side to make sure there aren’t any things that are wrong from a functional testing point.”

RELATED CONTENT:
How to solve your UI testing problems
There’s more to testing than simply testing

UI testing may be the tip of Mike Cohn's popular Testing Pyramid, which describes the relative proportion of tests at each layer (many fast unit tests at the base, fewer UI tests at the top), but that doesn't mean UI testing can be ignored and just slapped onto the end of a testing cycle.

“I believe UI testing is too often an afterthought during product development, and is often placed much too late in the development life cycle,” said Jon Edvald, CEO and co-founder of automation platform Garden. “It can be a serious drag on productivity and release cycles to do UI testing — and in fact any type of testing — too late in the process. The later an issue is discovered, the more costly the feedback loop becomes, with all the context switching involved and the terribly slow iteration speeds.” 

Google’s Web Vitals
One measurement that can be used to quantitatively measure user experience is Google’s Web Vitals. According to Guillermo Rauch, CEO of front-end web development company Vercel, these Web Vitals were the first metrics created that focused entirely on user experience. One of the Web Vitals is Largest Contentful Paint (LCP), which measures “how fast the meaningful part of the page took to load,” he explained. Rauch pointed out that everyone has likely visited a website where it looks like everything had loaded, but the content is still being loaded, so images, videos, or sometimes text might show up five seconds after everything else. 

“So this Largest Contentful Paint metric allows us to say how long did it take for us to load what the user is actually interested in? So when you talk about a visual storefront, it’s not just text, it’s also the picture of the coat that you want to buy,” Rauch explained. 

First Input Delay is another Web Vital that measures how long it takes from the user pressing a button, for example, to the site reacting. “If I tap on ‘buy’ is it reacting immediately? We take that for granted, but we’ve all been to websites where we tap and it doesn’t do anything so we kind of intuitively tap again. But a big percentage of users don’t tap again. So they just leave the website. We’re now starting to measure these user experience metrics very diligently,” Rauch said.

Another reason to care about these Web Vitals is that they’re not just used by development teams to measure how happy their users are, but Google can use them to determine search engine rank, Rauch said. In other words, a website with poor Web Vitals may rank lower, even if the proper search engine optimization (SEO) has been done for the site. 
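In practice, teams bucket field measurements against Google's documented thresholds for each vital. A small illustrative classifier; the cut-off values are Google's published "good"/"poor" boundaries, while the function itself is just a sketch:

```python
# Google's published thresholds per vital (units: seconds for LCP,
# milliseconds for FID, unitless layout-shift score for CLS).
THRESHOLDS = {
    "LCP": (2.5, 4.0),
    "FID": (100, 300),
    "CLS": (0.1, 0.25),
}

def rate_vital(name, value):
    """Classify a field measurement as good / needs improvement / poor."""
    good, poor = THRESHOLDS[name]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(rate_vital("LCP", 1.8))   # good
print(rate_vital("FID", 180))   # needs improvement
print(rate_vital("CLS", 0.3))   # poor
```

Collecting the raw values in the browser is typically done with Google's `web-vitals` JavaScript library; the bucketing above is what a dashboard or alerting job would then apply server-side.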

Get key stakeholders involved in the process
Involving product users in UI testing is an important part of the process for companies developing products. At Sorenson, which develops communications products that serve the Deaf community, QA engineering manager Mark Grossinger, who is Deaf, explained that this is an important part of his team’s testing process. His team is constantly reevaluating the needs of its users so that it knows what tools and features to provide, and UI testing is an important step in the process. 

In order to do their UI testing, Sorenson works closely with the Deaf community, as well as other Deaf employees within the company. It’s important for them to do this because as Grossinger describes, “if you have a hearing individual who does not know what a Deaf person needs or how they would use a product, how would they develop it?” 

One example of where that feedback loop made a big contribution was in developing video mail, which Sorenson calls SignMail. “As a hearing person, you can leave a voicemail if your recipient doesn’t answer your call,” Grossinger said. “In the past, there was no equivalent for the Deaf community, so we developed that feature. Now, if a Deaf person is unable to answer a call, the interpreter (or a Deaf caller) can leave a video mail (in American Sign Language), which gives the Deaf person a functionally equivalent message.” 

Another example that Grossinger noted was developing the Group Call feature. He explained that hearing individuals can utilize conference calls if they want to talk with multiple people, so Sorenson used feedback from its customers to create an equivalent feature. 

While Sorenson specifically serves the Deaf community, Grossinger noted that the need to involve the people who would be using the product is key no matter who the user base is. “When you’re developing software, you need to have people within whatever community you’re serving, whether it’s accessibility or something else,” Grossinger said. “You need to get those people to give authentic feedback. If you don’t, you could end up developing software or an app that doesn’t meet the intended user’s needs. So, I think that from the beginning of any project, when you’re at the drawing board and beginning the innovation process, you need to talk about the stakeholders and the features they need.”

According to Grossinger, diversity is the key to successful UI testing. This includes, not just a diverse development team, but diversity among all departments of the company, such as sales and marketing. 

“Sometimes on a smaller team you notice that details can get missed without having that diversity,” said Grossinger. “Diversity also means thinking about various demographics: older populations, younger populations, socioeconomic status, education levels, people who are savvy with technology and those who are not. All those perspectives need to be included in the project development.” 

Manual vs. automated UI testing
While automated testing has been a big buzzword around the industry, it hasn’t quite made its way fully into UI testing yet. 

“It’s unfortunately quite common for most or even all UI testing to be manual, i.e. performed by developers and/or testers, essentially clicking or tapping around a UI, often following predefined procedures that need to be performed any time a change has been made,” said Garden’s Edvald. “This may be okay-ish for small and simple applications but can quickly balloon into a massive problem when the UI gets more complex and has a lot of variations.”

Edvald described his experience witnessing this firsthand when doing development for the menstrual period tracking app Clue. According to Edvald, it was important to the company to have the app available to as many different users as possible, so it is supported on a variety of devices, from “ancient Android phones to the latest iPhones, tons of different screen sizes and operating system versions, many different languages etc.” 

Having all of these different devices, with varying screen sizes and operating systems, led to massive complexity when it came to testing. Manual testing wasn’t going to be possible because a QA team wouldn’t be able to manually test every possible combination at the speed with which the app was being developed and released. To solve the problem, they hired a quality engineer and put more effort into automation. 
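The combinatorial pressure Edvald describes is easy to see with a back-of-the-envelope calculation; the support matrix below is hypothetical, but even these modest dimensions multiply quickly:

```python
from itertools import product

# Hypothetical support matrix of the kind Edvald describes.
platforms = ["iOS", "Android"]
os_versions = ["oldest supported", "previous", "latest"]
screen_sizes = ["small", "medium", "large", "tablet"]
languages = ["en", "de", "fr", "es", "pt"]

# Every combination a fully manual pass would have to cover.
combinations = list(product(platforms, os_versions, screen_sizes, languages))
print(len(combinations))  # 2 * 3 * 4 * 5 = 120 configurations
```

At 120 configurations per release, a manual pass that takes even ten minutes per configuration costs twenty person-hours, which is why automation wins once variation grows.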

“Being able to programmatically run a large number of tests is critical when an application reaches some level of complexity and variation,” said Edvald. “Even for less extreme scenarios, say web applications that have fewer dimensions to test, the cost-benefit quickly starts to tip in favor of greater automation. It’s usually a good idea to also do some level of manual, exploratory testing, but it’s possible to greatly reduce the surface area to cover, and thus the time and resources involved.”

While automating UI testing is the ideal, it’s not always practical for every organization out there. For many organizations, there just isn’t the expertise to do all of the automation, and do it properly. 

According to Tricentis’ Sprauve, most companies still rely on manual testing for that reason. For example, they will have QA testers sitting in front of a computer and manually performing test steps. “One of the issues with UI automation in some instances is what’s known as flaky tests,” Sprauve said. “That could either be because of the type of the tool that you’re using, or it’s because of a lack of skills of how you build that automation. So one of the biggest challenges is how do I build consistent and resilient automation that won’t break when there are changes or if there are changes, how quickly can we recover to make sure that we get those items addressed so that we’re not just spending time making sure our automation is working versus actually testing. So that’s one of the biggest challenges that an organization faces.”
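One common, if imperfect, mitigation for the flaky tests Sprauve mentions is an automatic retry wrapper. A sketch, where the simulated check stands in for a real UI assertion; note that retries mask flakiness rather than cure it:

```python
def run_with_retries(test_fn, attempts=3):
    """Re-run a flaky UI check a few times before declaring failure.

    Retries buy time while the underlying selector or timing issue
    gets fixed; they should not be the permanent answer.
    """
    failures = []
    for _ in range(attempts):
        try:
            test_fn()
            return True, failures
        except AssertionError as exc:
            failures.append(str(exc))
    return False, failures

# Simulated flaky check: fails twice (e.g. an element not yet rendered),
# then passes, mimicking a timing-dependent UI test.
calls = {"n": 0}
def flaky_check():
    calls["n"] += 1
    if calls["n"] < 3:
        raise AssertionError("element not yet visible")

passed, failures = run_with_retries(flaky_check)
print(passed, len(failures))  # True 2
```

Logging the intermediate failures, as above, matters: a test that only passes on retry is a signal to fix a wait condition or selector, not a success to ignore.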

Michael O’Rourke, senior product manager at Micro Focus, noted that most customers average around 30% to 40% automation, and they only have a few customers that are more than 80% automated. “It takes a lot of work to be able to get to that and it involves a lot of different types of transformations that need to be incorporated in there,” said O’Rourke. “Those customers that put more emphasis on automation are generally the ones that are going, but when it comes to different time constraints, some customers find it easier to be able to only automate certain tests, but then still leave some manual processes behind, which is why it generally ends up roughly around 40%.” 

Other than the challenges with complexity and automation, another common challenge for QA teams is maintenance. According to O’Rourke, sometimes a developer may make a change to the UI and then when it gets sent back to the QA person, the test fails. “And they’ll spend a lot of time trying to troubleshoot to figure out what happens because the UI change obviously wasn’t properly documented or told to the QA team. That’s where a lot of the challenges come because they have to go back and modify a lot of the scripts and do a lot of maintenance on it anytime it changes,” he said. 

This is especially challenging early on in product life cycles when a product is being developed, O’Rourke explained. “This could be a very big problem for a lot of those different customers who have to frequently test continuously over and over again where breaks are and then go back and adjust tests,” he added.

Vercel’s Rauch predicts that going forward, the web will just keep becoming more user personalized and user experience will continue being an important part of QA. “The core Web Vitals are here to stay, they’re great for developers, they’re great for business people to find this alignment, but more important than that, it’s not just those core Web Vitals,” Rauch said. “I think measuring user experience especially in ways that are unique to your product and unique to your channels is also going to be very, very important. This is just again, the name implies, these are vital signs. This is just like measuring your heart rate, but also if you want to have a great fitness performance, you gotta measure a lot of other things, not just your heart rate.” 

The post UI testing a key part of delivering good user experiences appeared first on SD Times.

User feedback can be found in unlikely places (premium)
https://sdtimes.com/softwaredev/user-feedback-can-be-found-in-unlikely-places/
Fri, 07 Aug 2020 16:13:09 +0000

In 2013, customer experience firm Walker released a report in which it predicted that by 2020, user experience would be the key differentiator for brands, and price and product would become less important to users when choosing among different digital services.

Well, 2020 is here, and that prediction seems to have been pretty accurate. In today’s highly competitive digital economy, offering your customers a product they actually like to use is key. Unless you have a highly specific, one-of-a-kind application, there’s always another — possibly better — option for your users to switch to if you’re not providing what they need. 

The testing phase of the software development life cycle may help find bugs in an application, but it can’t catch everything. To ensure that your users are actually enjoying the time spent in your application, you need to continually gather feedback from those users. 

“Even a program that is coded perfectly and meets the standards of the stakeholders may not work in the way certain users interact with the application,” said Nikhil Koranne, assistant vice president of operations at software development company Chetu.

RELATED CONTENT:
UX design: It takes a village
Mapping customer journeys: The path to better UX

A 2019 survey from Qualtrics revealed that in 2020, 81% of respondents had planned to increase their focus on customer experience. The report also showed that 71% of companies with customer experience initiatives saw positive value from those efforts.

By gathering user feedback, development teams can continually improve and finetune their applications to give their users requested features and fix issues that might not have been discovered during testing. 

There are a number of ways that teams can collect user feedback, some of which might just occur naturally. For example, user reviews or support questions are things that might come in naturally that teams can use as feedback. If a team is getting a lot of support questions about a certain feature because they’re finding it difficult to use, they can utilize that information in determining what needs to be worked on in future releases. “With hundreds of questions a day, we keep a pulse on what people are asking for or where we could make parts of our product easier to understand. We aggregate these support conversations and share common themes to help the product team prioritize,” said Zack Hendlin, VP of Product at OneSignal.

Hendlin also said that his team collects feedback in other forms, such as data analysis, user research sessions, and conversations with customers.

The team analyzes user data, such as where users start an action and where they drop off. “Looking at points where there are big dropoffs in integrating us into their site, viewing delivery statistics, upgrading to a paid plan for more features, and the like allow us to optimize those parts of the user journey,” he said. Hendlin added that heatmapping and analytics tools, such as Hotjar and Google Analytics, are useful for this type of user data analysis. 
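The drop-off analysis Hendlin describes amounts to comparing user counts at adjacent funnel steps. A sketch with hypothetical numbers (the step names and counts are invented for illustration):

```python
# Hypothetical funnel counts: how many users reached each step
# of adopting the product.
funnel = [
    ("signed up", 1000),
    ("added SDK snippet", 620),
    ("sent first notification", 410),
    ("upgraded to paid plan", 95),
]

def dropoff_rates(steps):
    """Percent of users lost between each pair of adjacent funnel steps."""
    rates = []
    for (prev_name, prev_n), (name, n) in zip(steps, steps[1:]):
        lost = (prev_n - n) / prev_n * 100
        rates.append((f"{prev_name} -> {name}", round(lost, 1)))
    return rates

for transition, pct in dropoff_rates(funnel):
    print(f"{transition}: {pct}% drop-off")
```

The step with the steepest drop-off (here, the upgrade to a paid plan) is where optimization effort pays off most, which is exactly the prioritization signal Hendlin's team extracts from this kind of data.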

Itamar Blauer, an SEO consultant, also said he found Hotjar to be a helpful tool to track user behavior across sites. “The best ways that I found to monitor user experience were through heatmap analyses. I would use tools such as Hotjar to track user behavior across a website, identifying the portions of content they were most attracted to, as well as seeing at which part of the page they would exit.”

User research sessions are sessions in which a select number of users get access to an early version of a new release. According to Hendlin, this process can help answer the following questions: “Is the way we are planning to solve the problem actually solving the problem for them? Is it easy to use without needing explanation? Are there needs or desires they have that we haven’t thought about?”

User research sessions are also referred to as user acceptance testing (UAT), which often occurs in the last phase of the development cycle, explained Chetu’s Koranne.

According to Koranne, UAT is typically handled by the project management team. The team is responsible for setting up the parameters of the testing environment, such as the testing group and script of commands. This team then delivers the results of testing back to the developers. Koranne recommends that beta release participants be selected carefully and thoughtfully. 

“The ideal testing group that project managers are looking for would consist of third-party, real-world users with relevant experience,” said Koranne. “These types of users will be able to maneuver through the programs without any preconceived notions of how the process should work, and approach each action the same way other end-users would operate. Stakeholder testing is important as well, as you want to make sure that the program is running as it was originally proposed, but the real value comes from end-users that the application is being built for. When it comes to what kind of end-users are preferred over others, project managers want those with industry experience in the function the application is being developed for, rather than a completely random sample. However, users from a diverse set of company backgrounds are preferable to ensure that the program is accounting for operational use from a multitude of end-users.”

The final way that Hendlin’s team at OneSignal gathers feedback is by having actual conversations with their customers. By engaging with customers, product teams may learn where there are disconnects between users and the products, and what they can do to fix those.  

“Really understanding users comes from talking to them, observing how they interact with the product, analyzing where they were trying to do something but had a hard time, and seeing where they need to consult documentation or ask our support team,” said Hendlin. “There was a Supreme Court justice, Louis Brandeis, who said ‘There is no such thing as great writing, only great rewriting,’ and working on building a product and improving it is kind of the same way. As you get user feedback and learn more, you try to ‘re-write’ or update parts of the product to make them better.”

Anna Boyarkina, head of product at online whiteboard tool Miro, said that Miro also gathers feedback from a variety of sources, including support tickets, surveys, customer calls, and social media.

Product integration teams
With information coming in from all of these different sources, it’s important to have a process for handling and sorting through it all. Boyarkina explained that at Miro, there is a product integration team tasked with consolidating all of this feedback and then handing it off to the appropriate team. All of their feedback gets put into a customer feedback lake and then tagged. “For instance if it is a support ticket, it is tagged by a support team,” she said. “If it is from the call of the customer, there is a special form which is submitted by a customer success representative or sales representative, which also contains a tag.”
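A feedback lake with source tags can be as simple as a list of tagged records. A toy sketch loosely modeled on the process Boyarkina describes; the tag names and entries are invented for illustration:

```python
# Minimal "feedback lake": every piece of feedback is stored with a tag
# identifying the channel it arrived through, so teams can filter later.
feedback_lake = []

def ingest(text, source):
    """Add one piece of feedback, tagged by its channel of origin."""
    feedback_lake.append({"text": text, "source": source})

ingest("Export to PDF is broken", "support_ticket")
ingest("Would love offline mode", "customer_call")
ingest("Love the new templates!", "social_media")

def by_source(source):
    """Pull out everything a given channel contributed."""
    return [item["text"] for item in feedback_lake if item["source"] == source]

print(by_source("support_ticket"))  # ['Export to PDF is broken']
```

Real pipelines add timestamps, product-area tags, and deduplication, but the core idea is the same: tag on ingestion so that routing to the right team is a query, not a manual sort.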

Koranne believes that feedback needs to be considered from a cross-functional perspective, because many applications in business don’t just affect a single team. For example, an HCM application might be used across HR, Payroll, and IT, so all three of those teams would need to be involved in the process of gathering feedback. “Conversely, the project management/development team would need a cross-functional setup as the feedback given may affect multiple layers of the application,” said Koranne.

According to Koranne, an ideal cross-functional team would consist of a business analyst, tester, and UI/UX developer to address feedback items. 

Prioritizing features
Once the information is with the appropriate team, that team needs to decide what to do with it. At OneSignal, the product team goes through and ranks feature requests on five factors, Hendlin said: frequency, user value, financial benefit, strategic importance, and engineering effort. 

Frequency is related to how common a request is. For example, if a similar request is coming in hundreds of times, then that feature would be more highly prioritized than a feature that is only causing issues for a handful of users. 

They also look at the impact a feature has on a user. For example, a minor change to the UI would be lowly ranked, while seamless data syncing would rank highly, Hendlin explained.

The next two factors are considerations for the business side of things. The team considers what financial benefit there is to fixing something. In other words, would users be willing to pay for the feature? The team also considers whether a new feature drives new growth opportunities for the company.

Finally, the team looks at how hard the feature would be to build, as well as how much time and effort it would take to build it. 

“Once we weigh these attributes for a feature, we decide what to take on…and just as importantly, what not to,” Hendlin said. 
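One way to turn those factors into a single rank is a weighted score. The sketch below is illustrative only: the factor names follow Hendlin's description, but the weights and the 1-to-5 scores are invented:

```python
# Invented weights over the factors Hendlin describes; scores run 1-5.
WEIGHTS = {
    "frequency": 0.25,
    "user_value": 0.25,
    "financial_benefit": 0.2,
    "strategic_growth": 0.15,
    "engineering_effort": 0.15,  # higher score = less effort required
}

def priority_score(scores):
    """Weighted sum of factor scores, rounded for readability."""
    return round(sum(WEIGHTS[f] * scores[f] for f in WEIGHTS), 2)

# Hendlin's own contrast: seamless data syncing vs. a minor UI change.
seamless_sync = {"frequency": 4, "user_value": 5, "financial_benefit": 4,
                 "strategic_growth": 4, "engineering_effort": 2}
minor_ui_tweak = {"frequency": 2, "user_value": 1, "financial_benefit": 1,
                  "strategic_growth": 1, "engineering_effort": 5}

print(priority_score(seamless_sync) > priority_score(minor_ui_tweak))  # True
```

Whatever the exact weights, the point of scoring is the last step Hendlin names: making explicit not only what to build, but what to decline.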

Gathering and implementing feedback is an ongoing process
The team at OneSignal works in weekly sprints, Hendlin explained. Before each sprint, the team meets and determines whether something that came up through user feedback ranks higher than what they had been planning to work on during that sprint. “We try to avoid major changes midway through a sprint, but we will re-prioritize as we learn more from our customers,” said Hendlin. 

Boyarkina’s team also prioritizes the information gathered from customer feedback. She explained that some feedback requires immediate attention, and that if it is a critical issue they have a 24-hour SLA for fixing it, so those issues get implemented right away. If it is a feature request, it gets moved into a backlog and discussed. 

The product team at Miro gets together on a biweekly basis and is given a report with user insights. On top of that, it holds monthly user insights meetings where it dives into what users are saying and any trends that are occurring.

When considering whether to implement feature requests, there are a few things Miro teams look at. First, they determine whether a feature aligns with their existing product roadmap. They also look at the frequency of a particular request. “If we see that it is something that appears more frequently and it is something that appears really painful, we are taking it into the next development cycle,” said Boyarkina. 

As soon as the team has a prototype of that feature ready, users who requested that feature are invited to participate in a beta for it, Boyarkina explained. Those users are also informed when the feature is actually released. “If we know who requested a certain feature we usually send a ‘hey we released this, you asked for that’ and it’s usually really pleasant for people,” said Boyarkina.  

Challenges of gathering user feedback
One of the obvious challenges of gathering and interpreting user feedback is being able to consolidate and sort through information coming in from different sources. 

“Even when an organization is able to successfully set up the technological capability, (not to mention the cultural support for), gathering continuous user feedback, it’s another task entirely to smartly parse that information, synthesize the insights, determine course of action, and then execute on them,” said Jen Briselli, SVP, Experience Strategy & Service Design at Mad*Pow, a design consulting company. 

Briselli went on to explain that viewing this as a challenge is a bit of a red herring. “Figuring out the most successful way to procure, interpret, and act on this feedback is less a function of logistics around how, and far more critically a function of internal alignment around why,” said Briselli. 

She believes that companies with the most success around this are the ones in which there is stakeholder buy-in to the idea. “Solving for the logistics of data collection and response, and translation for user requirements and development, all fall more naturally out of the process when leadership has bought in and invested in the outcome. From there, finding the methods that fit existing workflows and building the skill sets necessary for its execution resolve more easily,” she said. 

Mehdi Daoudi, co-founder and CEO of Catchpoint, agrees that a big challenge is the vast amount of data, but he sees this as an opportunity more than a challenge. “I think the fact that we have these different data sources make the data even more accurate because it allows us to not only connect the dots but validate that the dots are even correct,” said Daoudi. “So I think the biggest challenge is the amount of data, but I think there is an opportunity there just because of its richness as well.”

User sentiment analysis
The process of gathering user feedback should be tied closely to application performance monitoring (APM). According to Catchpoint, a lot of APM and network monitoring tools often forget about the “last mile of a digital transaction” — the user.

“Why do you buy expensive monitoring tools if it’s not to catch problems before users are impacted? That’s the whole point,” said Daoudi.

User sentiment analysis is another element of user monitoring. User sentiment analysis uses machine learning to sort through all of the feedback that is coming in and interprets how a user might be feeling. “Because we are in an outcome economy, in an experience economy, adding the voice of the users on top of your monitoring data is critical,” said Daoudi.
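Even a deliberately naive keyword-based scorer illustrates the idea of turning feedback text into a sentiment signal; production systems of the kind Daoudi describes use machine learning rather than hand-picked word lists:

```python
# Toy word lists; real sentiment models learn these associations from data.
POSITIVE = {"love", "great", "fast", "easy"}
NEGATIVE = {"broken", "slow", "crash", "frustrating"}

def sentiment(text):
    """Crude polarity: count positive vs. negative keyword hits."""
    words = set(text.lower().replace("!", "").replace(".", "").split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("Checkout is broken and slow"))       # negative
print(sentiment("Love how fast the new editor is"))   # positive
```

Aggregated over thousands of self-reported issues, even a coarse signal like this lets an operations team see a sentiment dip line up with a service outage, which is the overlay on monitoring data that Daoudi argues for.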

As part of this user sentiment analysis, Catchpoint has a free website called WebSee.com, which collects and analyzes user sentiment data. The goal of WebSee is to enable organizations to better respond to service outages. End users can self-report issues they have on various sites, and that data is aggregated and verified by Catchpoint. 

According to Daoudi, user sentiment is a big part of observability. “People talk about observability but what are we observing? Machines? Are we observing users? It’s actually a combination of both these things and we are all trying to make the internet better for our customers and our employees and so observability needs to take into account multiple things including user sentiment.”

6 best practices for integrating design 
According to a recent study from Limina, only 14% of organizations consider themselves to be Design-Integrated companies. Design-Integrated organizations, according to Limina, are those that are “embedding a human-centered design culture into their organizations to gain both exceptional customer experiences as well as business and financial goals.”

“When the entire organization is focused on the needs of the user and the value their product delivers, better products are created and brought to market,” said Jon Fukuda, co-founder and principal of Limina. “Strong alignment among cross-functional teams creates higher efficiency, working together toward the common goal of creating higher quality user-centered products, which leads to cost savings and increased revenue.”

According to Limina, there are three major barriers to becoming Design-Integrated: C-level support, human-centered design culture, and alignment of operations and metrics. The company offered up six best practices that companies should follow if they wish to become a Design-Integrated Business:

  • “Embed a human-centered design culture in every corner of the company, starting with the C-suite
  • Establish a common language to drive understanding, mitigate risks, and improve processes
  • Integrate design resources into relevant business functions
  • Capture specific metrics and manage them to bridge organizational divisions and drive business outcomes
  • Create reusable artifacts and repeatable processes
  • Invest in artifacts, then processes, then systems”

The post User feedback can be found in unlikely places appeared first on SD Times.

Considering CX is the key to unlocking value from Agile
https://sdtimes.com/agile/considering-cx-is-the-key-to-unlocking-value-from-agile/
Wed, 15 Jan 2020 19:54:46 +0000

The post Considering CX is the key to unlocking value from Agile appeared first on SD Times.

The benefits of Agile development have long been touted: faster development cycles, increased innovation, and improved productivity. But Agile isn’t just about developing faster. Taking into account customer experience (CX) is a key part of it, too.

As Agile has gained traction across the industry, software has morphed into a living thing that grows and changes, quickly. According to Appian’s director of software development Carlos Aguayo, a rapid release cycle is one in which development teams can “deliver incremental or complete value to your customers in terms of days, weeks, or a couple of months.” In longer release cycles — which had for so long been the norm — a customer might have to wait a year or more for a new release.

But faster releases are only a piece of the Agile puzzle. It’s not enough to shorten release cycles — you must be utilizing these shorter release cycles to deliver value to your users. Only then can you unlock the full value of Agile.

RELATED CONTENT:
Mapping customer journeys: The path to better UX
Differentiating CX from traditional software user experience
How to successfully apply DevOps in your CX development

“Every professional software developer or engineer understands that software are living, breathing systems that have to continually evolve, grow and adapt to changes in user demands and requirements as well as other external factors that could impact how the software is designed and built,” said Bryan Osima, software engineer and CEO of Uvietech Software Solutions. “The expectation is never — or should never be — that you build it once and then you’re done.”

Software isn’t built for the sake of being built. It’s built to be consumed, Osima explained. “If the vast majority of your users are clamoring for specific functionality and it makes sense to build it, then you should,” he said.

Progress’ principal developer advocate TJ VanToll explained that developers are obsessed with trends. “Our collective obsession is the reason the Gartner Magic Quadrants and Forrester Waves of the world exist—we want to use technologies on the rise, and abandon technologies that have fallen out of favor. And for good reason. Popular technologies have more learning resources, more support options, and look more attractive on a resume.”

But getting too swept up in trends can create a gap between what developers want to build and what users actually want from an application.

In the 2019 CX Industry Report, UserTesting claimed that customers are the most important trend to watch — not VR, AI, IoT, or whatever other technology buzzword that companies cling to. “No matter what some market research may indicate, the only thing that really matters is what matters to your customers. If you already include human insight as part of your product development lifecycle, then you’re in a great position. You can quickly leverage that insight to make informed decisions about what to build and when—and stay on top of trends in consumer behavior,” the CX report states.

According to Aguayo, a benefit of sticking to a rapid release schedule is that developers can get their code into production faster, rather than having to wait months or years to see the impact on the customer. This means that they will be able to see how users respond to an application’s new features and take that into consideration for the next release cycle.

“If you’re spending a lot of time working on something before a customer really gets to look at it and give feedback, it’s easy to get bogged down in details,” said Alex Nied, senior software engineer at SPR. Development teams might spend an excessive amount of time on a new feature that users might not even want.

By implementing a development loop in which user feedback influences development, teams can cut down on unnecessary development time and focus on the features that users actually want.

Osima added that it’s not possible to meet the needs of every single user with a single application. “The reality is not every user will be pleased and satisfied. The goal should be to ensure that the majority of users have a great experience and level of satisfaction with the software,” he said.

He recommends prioritizing what the most important requirements are for the majority of the user base, and then making sure that the update you’re shipping caters to that. On subsequent releases, you can begin introducing other features or changes.

Best practices for rapid release 
According to Osima, there are some best practices that teams should follow when building rapid release software. These include planning for extensibility, adopting agile and producing rapid iterations, and embracing modularity with code.

Planning for extensibility involves building software that you can either add things to or remove things from without destroying the entire foundation, said Osima.

Adopting agile and rapid iterations allows developers to get new versions to users as soon as possible, enabling teams to see which features are working and which aren’t, and then refactor appropriately, he said.

Finally, modularity involves breaking features into units that can exist independently. This allows developers to develop, test, and iterate upon certain features without having to work on the entire system at once, Osima explained.
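The article stays at the level of principle here; as a purely illustrative sketch of what Osima's modularity point looks like in code (all class and function names below are hypothetical, not from any framework mentioned in these articles), features can live behind a common interface and a registry, so any one of them can be developed, tested, or replaced without touching the rest of the system:

```python
from abc import ABC, abstractmethod

class Feature(ABC):
    """A self-contained unit that can be built and tested independently."""
    name: str

    @abstractmethod
    def handle(self, request: dict) -> dict: ...

class SearchFeature(Feature):
    name = "search"
    def handle(self, request: dict) -> dict:
        query = request.get("query", "")
        return {"results": [f"match for {query!r}"]}

class CheckoutFeature(Feature):
    name = "checkout"
    def handle(self, request: dict) -> dict:
        # Prices in integer cents, so totals stay exact.
        return {"status": "ok", "total": sum(request.get("prices", []))}

# A registry lets features be added or removed without changing the core.
REGISTRY = {f.name: f for f in (SearchFeature(), CheckoutFeature())}

def dispatch(feature: str, request: dict) -> dict:
    """Route a request to the feature unit that owns it."""
    return REGISTRY[feature].handle(request)

print(dispatch("checkout", {"prices": [999, 500]}))
```

Because each unit only depends on the shared interface, a team can iterate on `checkout` in one release and `search` in the next, which is exactly the release cadence the practices above are aiming for.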

“Developers working in rapid release cycles need to strike this difficult balance between spending time architecting a system that will adapt to changing requirements (focusing on the software’s benefit) and minimizing the time to deliver valuable features to their customers (focusing on the user benefit),” said Asa Schachar, lead developer advocate at Optimizely.

Microservices have become appealing to those teams working on rapid release schedules, for those reasons. “Seasoned developers realize that the best solution right now may not be the best solution in a year or even months from now. Engineers spend a lot of time thinking about how to design a system that scales with new technology or methodologies,” said Schachar. “There are hundreds of different ways to build something, but which way will allow you to best adapt to changing requirements or new technologies in the near future? This is partly why design patterns like microservices have become so appealing because when it comes time to adapt an aspect of the system, it’s only ‘micro’ in size.”

The post Considering CX is the key to unlocking value from Agile appeared first on SD Times.

]]>
How to successfully apply DevOps in your CX development https://sdtimes.com/softwaredev/how-to-successfully-apply-devops-in-your-cx-development/ Fri, 03 May 2019 14:00:06 +0000 https://sdtimes.com/?p=35141 When businesses embrace a “customer first” mentality, they become more reliant on technology than ever before. As these enterprises race to transform themselves into digital companies, the need for constant innovation of customer experience (CX) capabilities through software development comes into focus. For customer experience products and services, the imperative for quality is paramount because … continue reading

The post How to successfully apply DevOps in your CX development appeared first on SD Times.

]]>
When businesses embrace a “customer first” mentality, they become more reliant on technology than ever before. As these enterprises race to transform themselves into digital companies, the need for constant innovation of customer experience (CX) capabilities through software development comes into focus. For customer experience products and services, the imperative for quality is paramount because there are no second chances with customers. Today, customers are empowered in ways never seen before: their switching costs are low and social media stories are powerful ways to amplify their dissatisfaction.

Rapid innovation is a priority, but quality cannot be compromised. Companies need flawless customer experience execution from Day One. To drive digital transformation, innovate CX rapidly and assure quality, many development teams have turned to DevOps. Digital transformation requires a different model of software development. It requires a model of “perpetual evolution,” as McKinsey calls it, with many IT groups pressured to deliver ten significant projects each year, and category leaders far exceeding that.

Agile and DevOps practices enable companies to create small experiments to learn which products and experiences are embraced by customers, learn rapidly from shorter feedback cycles between development and operations, and therefore iterate as fast as the market and their customers are moving. The practices also add agility and resiliency to digital transformation projects. When the DevOps software methodology is applied to CX projects, Cyara’s CX Assurance experts highlight four important considerations:

  1. Quality is imperative for customer experience success

Customers have higher expectations than ever before and their experience feedback travels fast. Research by Customer Contact Week revealed that 54 percent of customers that had a bad customer experience considered switching companies, and 50 percent told friends, family, or coworkers about that issue.

  2. The customer’s perspective is paramount

As CX experts, we want every interaction a customer has with our company to be delightful and memorable. Therefore, the software must be designed with the customer’s end goal in mind, rigorously tested across the different channels, leveraging realistic journeys, and then monitored in real time to identify potential struggles before a customer experiences them. Only then can we be certain that we’re assuring an excellent CX.

  3. A single customer journey involves many complex technologies

Customers demand omnichannel journeys where they can interact with a company’s website, chatbot, live chat, IVR (interactive voice response), live voice agent, email, SMS, or other channels. They expect seamless journeys where each channel understands their unique context and history. The technology infrastructure required to connect siloed channels and pass customer data between channels is extremely complex.

For example, the IVR channel requires not just an IVR voice portal, but also VoiceXML applications, speech recognition, text-to-speech, and IP telephony (and that’s just the voice channel!). Connecting an IVR to another channel often requires a connection to a CRM system, computer telephony integration, an ecommerce application, and others—all in the cloud. Many of these systems are supported by legacy and/or homegrown technologies that can be fragile and difficult to evolve.
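The hand-off of customer data between channels described above can be made concrete with a small sketch. This is an illustration only, with hypothetical field and method names that do not come from any vendor API in these articles: each channel records where the customer has been and passes the accumulated context forward, so the next channel understands their history instead of starting from zero.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class JourneyContext:
    """Context one channel hands to the next so the customer never starts over.

    Illustrative only: these fields are hypothetical, not a real vendor schema.
    """
    customer_id: str
    channel: str                      # e.g. "ivr", "chat", "email"
    intent: Optional[str] = None      # what the customer is trying to do
    history: List[str] = field(default_factory=list)

    def hand_off(self, next_channel: str) -> "JourneyContext":
        """Record the channel the customer is leaving, then move them on."""
        stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
        self.history.append(f"{stamp} left {self.channel}")
        return JourneyContext(self.customer_id, next_channel,
                              self.intent, self.history)

# A customer starts in the IVR with a billing question, then escalates to chat.
ctx = JourneyContext("cust-42", "ivr", intent="billing question")
ctx = ctx.hand_off("chat")
print(ctx.channel, ctx.intent)  # chat billing question
```

In a real deployment this context would move through the CRM and telephony integrations listed above rather than an in-process object, but the requirement is the same: the intent and history must survive the channel switch.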

  4. The DevOps solution set is different for customer experience

DevOps for CX engenders unique requirements, and so there are purpose-built solutions that address these. Generally speaking, these solutions increase automation and facilitate an Agile approach to CX design and management. CX applications frequently involve voice interfaces, which demand specialized testing and monitoring. And, for complex contact-center software, you may need purpose-built technology to facilitate configuration management.

Anthem puts the customer first in its development projects
So, how does this work in practice? To illustrate this, I recently interviewed Anil Ravula, who heads up development of Anthem’s vast network of customer-service contact centers. With more than 73 million people served by its affiliated companies, including nearly 40 million within its family of health plans, Anthem is one of the nation’s leading health-benefits companies. As Anthem’s customer base has grown, so too has the challenge of ensuring its contact-center operations serve the needs of millions of members and providers nationwide. Their experience in applying DevOps methodologies to CX development is an excellent example of how to align Agile and DevOps with a customer-first approach.

Until recently, Anthem was barely managing to deliver weekly updates of its contact-center system, a massively complex task that typically commenced at 4pm, ran overnight, and involved multiple teams in different locations. This multi-step process included build, integration testing, deployment validation, and final rollout—and was almost entirely manual. Several challenges were associated with this approach:

  1. It precluded continuous integration and continuous deployment.
  2. It required coordination of different groups across different locations.
  3. Builds did not always incorporate the most important features.
  4. There was no automated regression testing to validate build and deployment.
  5. The lead time to implement new features was measured in months.

In 2017, as part of a companywide adoption of an Agile/DevOps approach, Anthem transitioned contact-center applications development to a DevOps-driven continuous integration/continuous deployment approach. The overarching objective, from a development perspective, was to automate the build and deployment of the IVR system—the frontline service for Anthem’s interaction with customers. As part of its transition to Agile, Anthem adopted a sprint-based approach with a heightened focus on customer experience.

“To enable faster innovation of our IVR systems, we embraced DevOps concepts and with that came a whole set of new tools,” said Anthem Staff Vice President Anil Ravula. “This was also a major shift of our mindset—we started to think in terms of user stories and to assign developers well-defined initiatives that focused on specific customer outcomes.”

Anthem’s new approach to CX system development automates the entire process and, most importantly, enables developers to develop, build, and test small, user-focused improvements before publishing these as deliverables to an enterprise artifact repository server. From there, deployment and testing are also automated. Crucially, testing is now more rigorous, with broader coverage that explores a huge number of potential cases and runs against what’s actually been deployed—with automated feedback of errors and other anomalies to the development team.
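The article does not show Anthem's pipeline itself, but the shape of an automated post-deployment regression check can be sketched generically. In this Python illustration, stubbed responses stand in for real calls to a deployed IVR, and none of the names come from Anthem's or Cyara's actual tooling; the point is that every case runs against the live deployment and any mismatch is reported back automatically.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class TestCase:
    name: str
    call: Callable[[], str]   # exercises one path through the deployed system
    expected: str

def validate_deployment(cases: List[TestCase]) -> List[str]:
    """Run each case against the deployment; return descriptions of failures.

    An empty list means the build passed validation.
    """
    failures = []
    for case in cases:
        actual = case.call()
        if actual != case.expected:
            failures.append(
                f"{case.name}: expected {case.expected!r}, got {actual!r}")
    return failures

# Stubbed IVR responses stand in for real calls to the deployed system.
cases = [
    TestCase("main-menu greeting",
             lambda: "Welcome to support", "Welcome to support"),
    TestCase("claims prompt",
             lambda: "Press 2 for claims", "Press 1 for claims"),
]
failures = validate_deployment(cases)
print(failures)  # the second case fails and is reported back to the team
```

Feeding `failures` into a defect tracker or dashboard is what closes the loop the article describes: developers learn within the same cycle whether the build they shipped behaves as specified.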

“Whereas before we would run a series of defined tests manually, we can now use a fully automated approach using Cyara’s comprehensive set of IVR testing protocols,” said Ravula. “We also added a Lighthouse Dashboard to measure our build quality based on real-world testing and to provide visual reinforcement that we’re hitting our quality goals.”

The combination of investing in cultural change and the right technology has yielded the results Anthem was hoping for. Before the transformation, it took five to eight months to implement new features. Anthem can now innovate faster with smaller, more sure-footed steps—and derive meaningful business value from new features almost immediately, with weekly builds and new features deployed twice each month.

And while the benefits to Anthem’s customers are improved systems to get their questions resolved, there’s also been a quantifiable improvement for the development team, says Ravula: “What’s really great is that the team now has more reasons to celebrate their efforts because they see the success of delivering valuable features to our customers within weeks, not months.”

The post How to successfully apply DevOps in your CX development appeared first on SD Times.

]]>
Cyara brings DevOps to customer experience solution https://sdtimes.com/devops/cyara-brings-devops-to-customer-experience-solution/ Thu, 13 Dec 2018 16:17:54 +0000 https://sdtimes.com/?p=33618 Customer experience (CX) assurance platform provider Cyara wants to help businesses improve the speed and quality of their development, testing and production teams with the announcement of new DevOps integration. According to the company, the new features will enable businesses to increase their automation initiatives and provide an Agile approach to CX design and management. … continue reading

The post Cyara brings DevOps to customer experience solution appeared first on SD Times.

]]>
Customer experience (CX) assurance platform provider Cyara wants to help businesses improve the speed and quality of their development, testing and production teams with the announcement of new DevOps integration. According to the company, the new features will enable businesses to increase their automation initiatives and provide an Agile approach to CX design and management.

“With the functionality we’re announcing today, we’re making it easier to put CX at the center of your digital transformation,” said Alok Kulkarni, CEO and co-founder of Cyara. “This is the first step of a much broader strategy to maximize the value of the Cyara Platform through integration with other ecosystem players and tools.”

The company is adding new APIs and integration tools to make automation easier to implement and more accessible to users. Cyara’s design-driven testing component Velocity will now be able to share test cases, campaigns and CX designs with test-management, continuous integration, configuration management and Agile tools, the company explained. “For example, Cyara test cases can now be replicated directly in Micro Focus ALM (Application Lifecycle Management) to provide a single, accurate source for test-case management across the enterprise. Organizations can track and report holistically, with information about test scripts, runs, and results in ALM from Cyara,” the company wrote in its announcement.

In addition, Cyara is providing new integrations from blackchair and inProd to enable users to embed automated CX testing into their rollout processes. The integrations will also be able to automatically trigger tests for near-instant validation of changes, the company explained.

Cyara is looking to add additional DevOps solutions for defect tracking, lifecycle management and IT ticketing systems in 2019.

The post Cyara brings DevOps to customer experience solution appeared first on SD Times.

]]>
Differentiating customer experience from traditional software user experience https://sdtimes.com/cx/differentiating-customer-experience-traditional-software-user-experience/ Wed, 06 Dec 2017 14:00:11 +0000 https://sdtimes.com/?p=28292 In recent years, developing a great user experience has become critical for success in software development. With so many different options for products, users have the power and freedom to choose to use those companies with which they have the best experience. “UX is an established discipline,” said Jason Moccia, CEO of OneSpring. “It has … continue reading

The post Differentiating customer experience from traditional software user experience appeared first on SD Times.

]]>
In recent years, developing a great user experience has become critical for success in software development. With so many different options for products, users have the power and freedom to choose to use those companies with which they have the best experience.

“UX is an established discipline,” said Jason Moccia, CEO of OneSpring. “It has been around for many years in the software development space. I think what’s happening now, and will happen over the next year or two, is it is becoming a more important component of the software development life cycle.” Companies are beginning to understand that if customers have a bad experience interacting with their product, they may not return. Having a bad user interface can be just as bad for business as having a bad product.

More recently, customer experience, or CX, has also been gaining popularity as a new trend in application development. “When I think of the definition between both of them, I think of CX as the kind of emotional side of how customers interact with a company, whereas UX is all about interaction,” said Moccia. The UX is what drives a customer or user to use a product, but the CX aspect is how that interaction makes them feel.

According to Moccia, a key part of CX is the journey map, which follows the journey of a customer as he or she interacts with a company’s product. In UX, practitioners look at ‘personas,’ which are essentially representations of users. Both are very important to take into consideration when developing software.

UX and CX are increasingly important, especially given the impact they can have on the success of a product. “What you are trying to do is impact the bottom line,” Moccia told SD Times. “You are trying to increase the emotional positivity somebody will have while interacting with your company.”

Now that UX/CX is so crucial to building software that users will love, how can companies fit it into their existing development cycle without having to reinvent the wheel? Moccia said that companies are still trying to figure out where CX fits in their organization. Since it is a relatively new concept, working it into organizations can be tricky. “I think over the next year there is going to be more of a definition within organizations on what CX is, what it looks like for a company, and who oversees CX within that company,” said Moccia.

According to Moccia, understanding the roles and responsibilities in regard to UX/CX is a challenge for many. Depending on which development life cycle discipline you follow, UX/CX will be addressed at different points. He gave the example of Agile development, where UX and usability testing would typically come after development. He said he is seeing more companies bring UX/CX to the front, building a prototype to show to product users and then developing from that prototype. “You have to adapt to an organization and what they are trying to achieve on the customer experience side and make sure you get that right before you start building,” said Moccia.

Moccia said that development teams need to be open-minded about different disciplines in order to successfully implement good UX/CX. “There is a lot more that goes into it up front, so when we talk to developers about it there is a general resistance, because it alters the premise of Scrum in their mind,” he told SD Times.

“In software there is somewhat of a resistance because they look at it as a waterfall and what I always tell people is that it is not waterfall, you can still break apart user experience into iterative, bite-sized portions,” says Moccia. “We’re going to focus on just one portion of an application and really get that right and then give it to the development team to build and then we will focus on another one. So there are ways to slice this and I would say the challenge for developers, just being open to that and working really with an end in mind.”

The post Differentiating customer experience from traditional software user experience appeared first on SD Times.

]]>
DataStax updates its enterprise platform and unveils new strategy https://sdtimes.com/apache-cassandra/datastax-updates-enterprise-platform-unveils-new-strategy/ Wed, 15 Mar 2017 19:11:13 +0000 https://sdtimes.com/?p=24037 DataStax today announced the latest version of its data platform, DataStax Enterprise 5.1, as well as a new strategy to help enterprises design and implement customer experience  applications, called DataStax CX Data Solution. The new CX Data solution is made up of the DataStax Enterprise platform, centralized training and specialized services to help its customers … continue reading

The post DataStax updates its enterprise platform and unveils new strategy appeared first on SD Times.

]]>
DataStax today announced the latest version of its data platform, DataStax Enterprise 5.1, as well as a new strategy to help enterprises design and implement customer experience applications, called DataStax CX Data Solution.

The new CX Data solution is made up of the DataStax Enterprise platform, centralized training and specialized services to help its customers build their own CX applications, such as marketing, sales and ecommerce applications, according to Robin Schumacher, senior vice president and chief product officer at DataStax. With these services, the goal is to continue to advance the data requirements of cloud applications, including CX applications.

With the new version of its DataStax Enterprise 5.1 product, teams get new features and functions that directly contribute to helping facilitate CX applications, like marketing, commerce and sales applications.

Some of the major highlights from the DataStax Enterprise 5.1 release include high-performance operational analysis with simplified security management. The release also includes simplified, tighter integration for mixed workloads; production-certified Spark 2.0, so customers can have enhanced operational analytics; production certification for Solr 6.0; and Row-Level Access Control (RLAC), which provides finer-grained access security management for Cassandra, making it easier to protect sensitive data and reducing the management overhead for deployments.

DataStax’s new strategy and solutions for CX applications benefit those responsible for running the business and ensuring the CX application that is being developed is actually attracting customers and retaining them in their customer loyalty program, Schumacher explained. Typically, these services and solutions attract system architects, designers, developers and operators.

According to Schumacher, the new solution also has features that directly facilitate customer experiences, and it has a customer-centric focus.

Schumacher also mentioned that while DataStax is a big contributor to the Apache Cassandra project, the company is shifting resources toward developing more commercial components, meaning these features and functions will not be given back to open source and will instead be reserved for DataStax customers.

One of the things DataStax has noticed with CX applications is that they need to be contextual, according to Schumacher. In other words, the application needs to be able to return information back to the end customer that fits the context they are in right now, whether they are in a certain location or they are buying certain products.

“The information that comes back needs to fit the context and that context is constantly changing,” said Schumacher. “What you have now is this blending of different workloads — a transactional aspect, an analytical aspect, a search aspect — and all these things are now coming together so you can pull off these very quick round trips that you need to make between these web and mobile apps.”

Schumacher added that it can be challenging to blend those different types of workloads, which is why DataStax’s new and upgraded solutions aim to pull all of that under one roof, so teams have one platform that does all of the work without the need to shift data back and forth.

“The real-time nature of it [is a benefit],” said Schumacher. “You want the information now, and you need the data to be real-time, and that can be a real challenge. Most of the database centers out there, especially legacy, can’t do this.”

The post DataStax updates its enterprise platform and unveils new strategy appeared first on SD Times.

]]>