Automated testing still lags https://sdtimes.com/test/automated-testing-still-lags/ Tue, 02 Aug 2022 20:20:17 +0000

Automated testing initiatives still lag behind in many organizations as increasingly complex testing environments are met with a lack of skilled personnel to set up tests. 

Recent research conducted by Forrester and commissioned by Keysight found that while only 11% of respondents had fully automated testing, 84% of respondents said that the majority of testing involves complex environments. 

For the study, Forrester conducted an online survey in December 2021 that involved 406 test operations decision-makers at organizations in North America, EMEA, and APAC to evaluate current testing capabilities for electronic design and development and to hear their thoughts on investing in automation.

The complexity of testing environments has increased the number of tests required, according to 75% of respondents. Sixty-seven percent said the time to complete tests has risen as well.

Challenges with automated testing 

Those that do utilize automated testing often have difficulty making the tests stable in these complex environments, according to Paulina Gatkowska, head of quality assurance at STX Next, a Python software house. 

One area where developers often run into challenges is UI testing, in which the tests work like a user: they use the browser, click through the application, fill in fields, and more. These tests are quite heavy, Gatkowska continued; sometimes a test that passes in a developer's local environment fails in another environment, works only 50% of the time, or passes for the first week and then starts to be flaky. 

“What’s the point of writing and running the tests, if sometimes they fail even though there is no bug? To avoid this problem, it’s important to have a good architecture of the tests and good quality of the code. The tests should be independent, so they don’t interfere with each other, and you should have methods for repetitive code to change it only in one place when something changes in the application,” Gatkowska said. “You should also attach great importance to ‘waits’ – the conditions that must be met before the test proceeds. Having this in mind, you’ll be able to avoid the horror of maintaining flaky tests.”
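The "waits" Gatkowska mentions correspond to explicit waits in Selenium, which block on a condition instead of sleeping for a fixed time. Here is a minimal sketch in Python; the page URL and element locators are hypothetical:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")  # hypothetical app under test

    # Explicit wait: block until the condition is met (or 10s elapse),
    # instead of assuming the element is already rendered.
    submit = WebDriverWait(driver, 10).until(
        EC.element_to_be_clickable((By.ID, "submit"))  # hypothetical locator
    )
    submit.click()

    # Wait for the outcome of the click as well, rather than sleeping.
    WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.CSS_SELECTOR, ".dashboard"))
    )
finally:
    driver.quit()
```

Because each step waits on an observable condition rather than a timer, the same test tolerates slower environments instead of failing intermittently on them.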

Then there are network issues that can impede automated tests, according to Kavin Patel, founder and CEO of Convrrt, a landing page builder. A common difficulty for QA teams is network disconnection: shaky connections make it hard to reach databases, VPNs, third-party services, APIs, and certain testing environments, adding needless time to the testing process. The inability to access the virtual environments testers typically use to test applications is another concern. 

Because some teams lack the expertise to implement automated testing, manual testing is still used as a stopgap for any automation gaps. This creates a disconnect with the R&D team, which is usually two steps ahead, according to Kenny Kline, president of Barbend, an online platform for strength sports training and nutrition.

“To keep up with them, testers must finish their cycles within four to six hours, but manual testing cannot keep up with the rate of development. Then, it is moved to the conclusion of the cycle,” Kline said. “Consequently, teams must include a manual regression, sometimes known as a stabilization phase, at the end of each sprint. They extend the release cadence rather than lowering it.”

Companies are shifting towards full test automation 

Forrester’s research also found that 45% of companies say that they’re willing to move to a fully automated testing environment within the next three years to increase productivity, gain the ability to simulate product function and performance, and shorten the time to market. 

The companies that have implemented automated testing well have reaped many rewards, according to Michael Urbanovich, head of the testing department at a1qa, an international quality assurance company. The ones relying on robotic process automation (RPA), AI, ML, natural language processing (NLP), and computer vision for automated testing have attained greater efficiency, sped up time to market, and freed up more resources to focus on strategic business initiatives. RPA alone can lower the time required for repetitive tasks by up to 25%, according to research by Automation Alley. 

For those looking to gain even more from their automation initiatives, a1qa’s Urbanovich suggests looking into continuous test execution, implementing self-healing capabilities, RPA, API automation, regression testing, and UAT automation. 

Urbanovich emphasized that the decision to introduce automated QA workflows must be conscious. Rather than running with the crowd to follow the hype, organizations must calculate ROI based on their individual business needs and wisely choose the scope for automation and a fit-for-purpose strategy. 

“To meet quality gates, companies need to decide which automated tests to run and how to run them in the first place, especially considering that the majority of Agile-driven sprints last only a few weeks,” Urbanovich said. 

Although some may wish it were that easy, testers can't just spawn automated tests and sit back like Paley's watchmaker god. The tests need to be guided and nurtured. 

“The number one challenge with automated testing is making sure you have a test for all possibilities. Covering all possibilities is an ongoing process, but executives especially hear that you have automated testing now and forget that it only covers what you actually are testing and not all possibilities,” said David Garthe, founder of Gravyware, a social media management tool. “As your application is a living thing, so are the tests that are for it. You need to factor in maintenance costs and expectations within your budget.” 

Also, just because a test worked last sprint doesn’t mean it will work as expected this sprint, Garthe added. As applications change, testers have to make sure the automated tests cover the new process correctly as well. 

Garthe said that he has had a great experience using Selenium, referring to it as the “gold standard” of automated testing. It has the largest pool of developers who can step in and work on a new project. 

“We’ve used other applications for testing, and they work fine for a small application, but if there’s a learning curve, they all fall short somewhere,” Garthe said. “Selenium will allow your team to jump right in and there are so many examples already written that you can shortcut the test creation time.”

And there are many other choices to weave through when starting the automated testing process.

“When you think about test automation, first of all you have to choose the framework. What language should it be? Do you want to have frontend or backend tests, or both? Do you want to use gherkin in your tests?” STX Next’s Gatkowska said. “Then of course you need to have your favorite code editor, and it would be annoying to run the tests only on your local machine, so it’s important to configure jobs in the CI/CD tool. In the end, it’s good to see valuable output in a reporting tool.”
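To make the frontend-versus-backend choice concrete, here is a minimal backend test sketch in Python using pytest and requests; the service URL and endpoint are hypothetical:

```python
import requests

BASE_URL = "https://api.example.com"  # hypothetical service under test

def test_health_endpoint_returns_ok():
    # Backend tests skip the browser entirely and assert on the API contract,
    # which makes them much faster and less flaky than UI tests.
    response = requests.get(f"{BASE_URL}/health", timeout=5)
    assert response.status_code == 200
    assert response.json()["status"] == "ok"
```

A job in the CI/CD tool would then run pytest on every commit and forward the results to the team’s reporting tool, so the tests never live only on one developer’s machine.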

Choosing the right tool and automated testing framework, though, might pose a challenge for some because different tools excel under different conditions, according to Robert Warner, head of marketing at VirtualValley, a UK-based virtual assistant company.

“Testing product vendors overstate their goods’ abilities. Many vendors believe they have a secret sauce for automation, but this produces misunderstandings and confusion. Many of us don’t conduct enough study before buying commercial tools; that’s why we buy them without proper evaluation,” Warner said. “Choosing a test tool is like marrying, in my opinion. Incompatible marriages tend to fail. Without a good test tool, test automation will fail.”

AI is augmenting the automated testing experience

Fifty-two percent of the companies that responded to the Forrester survey said they would consider using AI for integrating complex test suites within the next three years.

The use of AI for integrated testing provides both better (not necessarily more) testing coverage and the ability to support agile product development and release, according to the Forrester report.

Companies are also looking to add AI for integrating complex test suites, an area of test automation where adoption is severely lacking, with only 16% of companies using it today. 

a1qa’s Urbanovich explained that one of the best ways to cope with increased software complexity and tight deadlines is to apply a risk-based approach, and for that, AI is indispensable. Apart from removing redundant test cases, generating self-healing scripts, and predicting defects, it streamlines priority-setting. 
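Whether the numbers come from an ML model or from historical test runs, the arithmetic behind risk-based prioritization is straightforward. A minimal sketch, with made-up failure probabilities and impact weights:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    name: str
    failure_probability: float  # estimated from past runs or an ML model
    business_impact: int        # 1 (cosmetic) .. 5 (revenue-critical)

def risk_score(tc: TestCase) -> float:
    # Classic risk = likelihood x impact; AI-based tools refine the likelihood.
    return tc.failure_probability * tc.business_impact

suite = [
    TestCase("checkout_flow", 0.30, 5),
    TestCase("profile_avatar_upload", 0.40, 1),
    TestCase("login", 0.10, 5),
]

# Run the riskiest tests first, so a short sprint still covers what matters most.
for tc in sorted(suite, key=risk_score, reverse=True):
    print(f"{tc.name}: risk={risk_score(tc):.2f}")
```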

“In comparison with the previous year, the number of IT leaders leveraging AI for test prioritization has risen to 43%. Why so?” Urbanovich continued, alluding to the World Quality Report 2021-2022. “When you prioritize automated tests, you put customer needs FIRST because you care about the features that end users apply the most. Another vivid gain is that software teams can organize a more structured and thoughtful QA strategy. Identifying risks makes it easier to define the scope and execution sequence.”

Most of the time, companies are looking to implement AI in testing to leverage the speed improvements and increased scope of testing, according to Kevin Surace, CTO at Appvance, an AI-driven software testing provider.

“You can’t write a script in 10 minutes, maybe one if you’re a Selenium master. Okay, the machine can write 5,000 in 10 minutes. And yes, they’re valid. And yes, they cover your use cases that you care about. And yes, they have 1,000s of validations, whatever you want to do. And all you did was spend one time teaching it your application, no different than walking into a room of 100 manual testers that you just hired, and you’re teaching them the application: do this, don’t do this, this is the outcome, these are the outcomes we want,” Surace said. “That’s what I’ve done, I got 100 little robots or however many we need that need to be taught what to do and what not to do, but mostly what not to do.”

QA has difficulty grasping how to handle AI in testing 

Appvance’s Surace said that where testing ultimately needs to go is to be completely hands-off for humans.

“If you just step back and say what’s going on in this industry, I need a 4,000 times productivity improvement in order to find essentially all the bugs that the CEO wants me to find, which is find all the bugs before users do,” Surace said. “Well, if you’ve got to increase productivity 4,000 times you cannot have people involved in the creation of very many use cases, or certainly not the maintenance of them. That has to come off the table just like you can’t put people in a spaceship and tell them to drive it, there’s too much that has to be done to control it.”  

Humans are still good at prioritizing which bugs to tackle based on the business goals, because only humans can really look at something and say, well, we'll just leave it, it's okay, we're not going to deal with it, or say this is really critical and push it to the developers' side to fix before release, Surace continued. 

“A number of people are all excited about using AI and machine learning to prioritize which tests you should run, and that entire concept is wrong. The entire concept should be, I don’t care what you changed in the application, and I don’t understand your source code enough to know the impacts on every particular outcome. Instead, I should be able to create 10,000 scripts and run them in the next hour, and give you the results across the entire application,” Surace said. “Job one, two, and three of QA is to make sure that you found the bugs before your users do. That’s it; then you can decide what to do with them. Every time a user finds a bug, I can guarantee you it’s in something you didn’t test or you chose to let the bug out. So when you think about it that way, users find bugs in the things we didn’t test. So what do we need to do? We need to test a lot more, not less.”

A challenge with AI is that it is a foreign concept to QA people, so teaching them how to train AI is a whole different field, according to Surace. 

First off, many people on the QA team are scared of AI, Surace continued, because they see themselves as QA people but really have the skill set of a Selenium tester who writes Selenium scripts and tests them. Now that work is being taken away, similar to how RPA disrupted industries such as customer support and insurance claims processing. 

The second challenge is that they’re not trained in it.

“So one problem that we see is: how do you explain how the algorithms work?” Surace said. “In AI, one of the challenges we have in QA and across the AI industry is how do we make people comfortable with a machine that they may not ever be able to understand. It’s beyond their skill set to actually understand the algorithms at work here, why they work, and how neural networks work, so they now have to trust that the machine will get them from point A to point B, just like we trust the car to get from point A to point B.”

However, there are some areas of testing in which AI is not as applicable: for example, a form-based application, such as one in financial services, where there is nothing for the application to do other than guide you through the form. 

“There’s nothing else to do with an AI that can add much value because one script that’s data-driven already handles the one use case that you care about. There are no more use cases. So AI is used to augment your use cases, but if you only have one, you should write it. But, that’s few and far between and most applications have hundreds of 1,000s of use cases perhaps or 1,000s of possible combinatorial use cases,” Surace said. 

According to Eli Lopian, CEO at Typemock, a provider of unit testing tools to developers worldwide, QA teams are still very effective at handling UI testing because the UI can often change without the behavior changing behind the scenes. 

“The QA teams are really good at doing that because they have a feel for the UI, how easy it is for the end user to use that code, and they can see things more from a product point of view and less from a ‘does it work or not’ point of view, which is really essential if you want an application to really succeed,” Lopian said. 

Dan Belcher, co-founder of mabl, said that there is still plenty of room for a human in the loop when it comes to AI-driven testing. 

“So far, what we’re doing is supercharging quality engineers, so the human is certainly in the loop. It’s eliminating repetitive tasks where their intellect isn’t adding as much value and doing things that require high speed, because when you’re deploying every few minutes, you can’t really rely on a human to be involved in that loop of executing tests. And so what we’re empowering them to do is to focus on higher-level concerns, like do I have the right test coverage? Are the things that we’re seeing good or bad for the users?” Belcher said.

AI/ML excels at writing tests from unit to end-to-end scale

One area where AI/ML excels in testing is unit testing of legacy code, according to Typemock’s Lopian.

“Software groups often have this legacy code, which could be a piece of code where maybe they didn’t do a unit test beforehand, or there was some kind of crisis and they had to do it quickly, and they didn’t do the test. So you have this little piece of code that doesn’t have any unit tests. And that grows,” Lopian said. “Even though it’s a difficult piece of code that wasn’t built with testability in mind, we have the technology to both write those tests for those kinds of code and to generate them in an automatic manner using the ML.”

The AI/ML can then make sure that the code is running in a clean and modernized way, and with the tests in place, the code can be refactored safely, Lopian added. 
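Whether such tests are generated by an ML tool or written by hand, the usual pattern for untested legacy code is the characterization test: capture what the code currently does and lock that behavior in before refactoring. A minimal hand-written sketch in Python, with a hypothetical legacy function standing in for the real module:

```python
import pytest

def legacy_price(quantity, customer_type):
    # Stand-in for untested legacy code; imagine this buried in an old module.
    price = quantity * 9.99
    if customer_type == "wholesale":
        price *= 0.8
    return round(price, 2)

# Characterization tests assert on observed behavior, not on a spec:
# the expected values below were produced by running the code itself.
@pytest.mark.parametrize("quantity,customer_type,expected", [
    (1, "retail", 9.99),
    (10, "retail", 99.90),
    (10, "wholesale", 79.92),
])
def test_legacy_price_unchanged(quantity, customer_type, expected):
    assert legacy_price(quantity, customer_type) == expected
```

Once these tests pass, any refactoring that breaks them is flagged immediately, which is what makes the cleanup safe.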

AI-driven testing is also beneficial for UI testing because testers don't have to explicitly design how elements in the UI are referenced; they can let the AI figure that out, according to mabl's Belcher. And when the UI changes, typical test automation produces a lot of failures, whereas the AI can learn and improve the tests automatically, resulting in an 85-90% reduction in the time engineers spend creating and maintaining tests. 

In the UI testing space, AI can be used for auto healing, intelligent timing, detecting visual changes in the UI automatically, and detecting performance anomalies. 
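Vendors implement auto healing with learned models, but the core idea can be sketched as a fallback chain of locators for the same logical element. A minimal sketch using Selenium; the selectors are hypothetical:

```python
from selenium.common.exceptions import NoSuchElementException
from selenium.webdriver.common.by import By

# Ordered locator candidates for the same logical element; when the primary
# selector breaks after a UI change, the test "heals" by trying the next one.
CHECKOUT_BUTTON = [
    (By.ID, "checkout"),                               # most specific, breaks most often
    (By.CSS_SELECTOR, "[data-test=checkout]"),
    (By.XPATH, "//button[contains(., 'Checkout')]"),   # most resilient
]

def find_with_healing(driver, candidates):
    for by, selector in candidates:
        try:
            # A real tool would also log the miss and promote the working locator.
            return driver.find_element(by, selector)
        except NoSuchElementException:
            continue
    raise NoSuchElementException(f"No candidate locator matched: {candidates}")
```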

According to Belcher, AI can be the vital component in creating a more holistic approach to end-to-end testing. 

“We’ve all known that the answer to improving quality was to bring together the insights that you get when you think about all facets of quality, whether that’s functional or performance, or accessibility, or UX, and to think about that holistically, whether it’s API or web or mobile. And so the area that will see the most innovation is when you can start to answer questions like: based on my UI tests, what API tests should I have, and how do they relate? So when the UI test fails, was it an API issue? And then, when a functional test fails, did anything change from the user experience that could be related to that?” Belcher said. “And so the key to doing this is we have to bring all of the end-to-end testing together and all the data that’s produced, and then you can really layer in some incredibly innovative intelligence, once you have all of that data and you can correlate it and make predictions based on that.”

Six types of automated testing frameworks 
  1. Linear Automation Framework – also known as a record-and-playback framework, in which testers don’t need to write code to create functions and the steps are written in sequential order. Testers record steps such as navigation, user input, or checkpoints, and then play the script back automatically to conduct the test.
  2. Modular-Based Testing Framework – one in which testers divide the application being tested into separate units, functions, or sections, each of which can then be tested in isolation. Test scripts are created for each part and then combined to build larger tests. 
  3. Library Architecture Testing Framework – similar tasks within the scripts are identified and grouped by function, so the application is ultimately broken down by common objectives. 
  4. Data-Driven Framework – test data is separated from script logic and stored externally. The test scripts are connected to the external data source and read and populate the necessary data when needed (see the sketch below). 
  5. Keyword-Driven Framework – each function of the application is laid out in a table, with instructions in consecutive order for each test that needs to be run. 
  6. Hybrid Testing Framework – a combination of any of the previously mentioned frameworks, set up to leverage the advantages of some and mitigate the weaknesses of others.

Source: https://smartbear.com/learn/automated-testing/test-automation-frameworks/
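To make the data-driven framework (item 4 above) concrete, here is a minimal sketch in Python with pytest, assuming a hypothetical login_cases.csv file sits next to the test module:

```python
import csv
import pytest

def attempt_login(username, password):
    # Stand-in for the real system under test.
    return "ok" if (username, password) == ("alice", "s3cret") else "denied"

def load_cases(path="login_cases.csv"):
    # External data source: one row per case, no header, e.g.  alice,s3cret,ok
    with open(path, newline="") as f:
        return [tuple(row) for row in csv.reader(f)]

# The script logic stays fixed; adding a test case is just adding a CSV row.
@pytest.mark.parametrize("username,password,expected", load_cases())
def test_login(username, password, expected):
    assert attempt_login(username, password) == expected
```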

A role in identity verification https://sdtimes.com/ai/a-role-in-identity-verification/ Thu, 05 Mar 2020 17:35:55 +0000

Previous methods of identity verification aren’t as efficient today when there is so much more data in circulation. Augmented intelligence has entered the arena to provide much more accurate solutions for ID verification. 

Previous methods required only basic information, such as where someone lives, which applications would then check against a database. One company, Jumio, uses pictures of government-issued IDs to verify that a person is who they claim to be: the user takes a selfie that must match the photo on the original ID.

RELATED CONTENT: Augmented intelligence will help, not replace, human workers

AI can adjust to the different security features particular to passports from certain regions, whether that's a watermark, a photo, or the ID number, to determine whether a document is legitimate. Where AI falls down is when it can't read a security feature, whether that's due to fluorescent lighting or glare in the picture, said Dean Nicolls, the vice president of global marketing at Jumio. 

“When AI sees an ID and there’s a lot of glare or a lot of blur on it, the first thing it’s going to say is I can’t read it,” Nicolls said. “Solutions that are 100% automated that rely on AI completely are going to essentially say, sorry, I can’t read the image, and I’m going to return a non-readable.”

“So that leads to a really bad customer experience,” Nicolls continued. “And for our business customers, like in this case, the bank, now they’re put in a very bad position because now they’re being told that 30% of the transactions are unreadable.”

Jumio's solution employs human agents who are trained to find where the AI fell short and then render a decision. The AI additionally tells the agent specifically where its capabilities fell short; for example, it might report glare on the face due to overhead lighting, so the agent only needs to look at the face in the image rather than the whole ID. 

“The augmented technology helps our agents, guides them and tells them where to look and what to review to ultimately render a yes or no verification decision,” Nicolls said. 

Another way in which humans help train the AI is by looking at the models created for learning to help determine where AI fell short, creating a real-time feedback loop built into the process, Nicolls explained. 

“Essentially the way AI works is you throw a bunch of data at it and then you let the data determine the algorithms. Not only do our agents decide by giving a yes or no decision, but they are letting us know where the AI fell short. And the fact that we have human agents that are looking at a portion of those that are instructing us where the algorithms fell short, that’s also speeding up the learning process,” Nicolls said. 

Nicolls said that the goal is to eventually bring the share of cases the AI algorithms can't read down from 20% today to only 5% or 10%. However, the role of verification experts remains important in making the AI iteratively better over time. 

Augmented intelligence can prove useful in areas where the data isn't perfect. In radiology, for example, augmented intelligence works off clean, standardized chest scans, and in that case it could almost replace the radiologist, according to a Stanford study. With things like ID verification, however, a lot of the time the data isn't perfect.

“People think of AI as being a panacea and they think that AI can solve all the world’s problems, right? And in many cases it can solve a lot of problems when you have a lot of perfect data,” Nicolls said. “Most of the situations where I see AI applied, data isn’t perfect and you deal with some uncleanliness and this is where augmented intelligence really, really helps.”

Augmented intelligence will help, not replace, human workers https://sdtimes.com/ai/augmented-intelligence-will-help-not-replace-human-workers/ Thu, 05 Mar 2020 15:23:55 +0000

Augmented intelligence is growing as an approach to artificial intelligence, one in which machines help humans complete tasks faster rather than replacing them entirely. 

In an IBM report called “AI in 2020: From Experimentation to Adoption,” 45% of respondents from large companies said they have adopted AI, while 29% of small and medium-sized businesses said they did the same. 

All of these companies are still in the early days of AI adoption, and are looking for ways to infuse it to bolster their workforce.  

Ginni Rometty, the former CEO of IBM, said in a talk at the World Economic Forum that augmented intelligence is the preferable lens through which to look at AI in the future.

RELATED CONTENT:
A role in identity verification
Implications and practical apps for AI and ML in embedded systems

“I actually don’t like the word AI because cognitive is much more than AI. And so, AI says replacement of people, it carries some baggage with it and that’s not what we’re talking about,” Rometty said. “By and large we see a world where this is a partnership between man and machine and that this is in fact going to make us better and allows us to do what the human condition is best able to do.” 

Augmented intelligence is a cognitive technology approach to AI adoption that focuses on the assistive role of AI.

“I would explain augmented intelligence as something where you are augmenting a human being to do their tasks a lot better or more efficiently,” said Dinesh Nirmal, the vice president of Data and AI Development at IBM. “You’re not replacing anyone, but you are augmenting the skill set of that individual.”

The choice of the word augmented, which means “to improve,” reinforces the role human intelligence plays when using machine learning and deep learning algorithms to discover relationships and solve problems.

While a sophisticated AI program is certainly capable of making a decision after analyzing patterns in large data sets, that decision is only as good as the data that human beings gave the system to use.

Full automation is a ‘delusion’
“Full automation is a myth,” said Svetlana Sicular, research vice president at Gartner. “There is quite a bit of delusion that everything can be automated. There are certain things that can be automated, but humans have to be in the loop.”

She added that the majority of situations for creating full automation are very expensive and very hard to reach. 

“Once in a while, AI will go berserk for multiple reasons simply because I’m using it at the edge. So if you consider my phone the edge, I might lose the connection or there might be too many people in this area. Like in one instance, my navigation kept telling me on the highway to turn left while there was nowhere to turn left and I was actually thinking, ‘What if I am in an autonomous car?’ ” Sicular said. 

There are tasks that require expertise and decision-making that can only be accomplished with the kind of creativity that only humans bring to the table, according to David Schubmehl, research director of Cognitive/AI Systems at IDC. 

“AI is really an assistant to help you get done with the mundane and mindless tasks more quickly so you can focus on the more challenging creative aspects,” Schubmehl said. 

Autonomous AI is typically used by organizations for highly repeatable tasks, such as customer churn prediction in telecommunications and recommendations in retail and supply chain, Sicular mentioned. 

Since AI adoption is in its early stages, enterprises don't necessarily know whether adopting AI models will greatly improve their efficiency. Sicular said that first there must be a lot of analysis of whether AI is really worth adopting for certain use cases. 

Sicular also said that there are two large trends happening in the world of AI: the industrialization of machine learning and AI to make them better for scaling, and also the democratization of AI to spread the benefits of the technology evenly.

AI’s move to industries has led companies to look for an all-in-one solution. 

“Data scientists are far fewer compared to developers. That’s why there’s a big effort to try to deliver some kinds of AI in a box that could be slightly modified by developers, and another big effort is how to scale it,” Sicular said. 

Up until recently, all machine learning was done manually, meaning PhDs developed their own custom algorithms, and most of those algorithms were never developed and deployed at scale, according to Sicular. 

“You buy a service off the shelf, you go through the cloud, you can get image recognition, speech recognition, forecasting, text analytics. All of this is available without having specialists in your organization or skilled people and so on,” Sicular said. “But the question that’s at the core of augmented intelligence is how this is being adopted and for what purposes.” 

Augmented intelligence can be seen in sales coaching, in which inside sales employees get advice as they talk to customers. 

In healthcare, AI is used to help doctors or specialists find some of the things that they have missed rather than sift through hundreds of thousands of documents.

AI for IT operations is called AIOps; it augments the work of managers and IT workers by helping them detect anomalies or disruptions much faster. 

“A lot of customers are having trouble with the amount of data that’s coming in,” IBM’s Nirmal said. “Let’s say you have IoT data, transaction data, behavioral data, social data and other places, and the question is how do you do data discovery and its classification. At least every enterprise that we are working with, there’s a lot of interest in adopting AI.”

An example of augmenting would be a data engineer finding a value that was put into the zip code field but looks like some other kind of code. The data engineer can then determine whether the data makes sense in its place. If it's not a zip code but, for example, a Social Security number, the data engineer can go and change it, and the machine learning model will then know that this kind of number is not a zip code next time. 
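A minimal sketch of the rule-based half of that loop, in Python with pandas; the column names and records are made up. Values that fail the format check are queued for a human to review, and the human's decision becomes a labeled example for the model:

```python
import pandas as pd

# Hypothetical records; the middle zip_code value looks like a Social Security number.
records = pd.DataFrame({
    "name": ["A. Smith", "B. Jones", "C. Lee"],
    "zip_code": ["10001", "123-45-6789", "94103"],
})

# Flag values that don't match the US zip format (5 digits, optional +4 suffix);
# a human reviews each suspect and the correction is fed back as training data.
suspects = records[~records["zip_code"].str.match(r"^\d{5}(-\d{4})?$")]
print(suspects)
```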

Another big area of interest is in creating alerts that can detect anomalies and can be used in data centers, according to Nirmal. 

“Customers always want to figure out a problem before it happens, so anomaly detection becomes pretty critical to make sure you catch issues as data or logs come in,” Nirmal said. “There are some tasks, such as fraud detection, in which AI tends to generate a lot of false alerts, and humans with deep vertical knowledge need to oversee the process.”

Augmented intelligence can also refer to tools such as AI transcription of meetings, or add-ons to PowerPoint that recommend how to improve the slides as one goes through them. 

Developers also have access to tools that use AI to create more efficient code reviews to speed up the SDLC. For example, DeepCode shows errors in code based on troves of similar scenarios that occurred before, and then provides context on how to fix them. 

“What Grammarly does for written texts, we do the exact same thing for developers,” said Boris Paskalev, the co-founder and CEO of DeepCode.  “There are many areas where I believe that augmented intelligence actually helped developers because now we have a machine learning representation for code. In the DeepCode platform, you can really add any service on top of it because you really have the knowledge of the global development community that you can index in real-time. So we can get the answers in seconds, which is quite exciting, considering these capabilities did not exist just a couple of years ago.”

All in all, companies are growing their investments in AI and it is becoming a fundamental part of every industry’s digital transformation, according to IDC’s Schubmehl. 

“Amazon wouldn’t be where it was without machine learning and AI, because it’s a huge part of its offering,” Schubmehl said. “We’re really in the early days. We’ve finally gotten out of the prototyping phase and we’re actually starting to put real AI and machine learning into production. I think we still have years and years to go before AI is what I would call fully mature.”

Productivity tools are crucial in the current development landscape https://sdtimes.com/softwaredev/productivity-tools-are-crucial-in-the-current-development-landscape/ Fri, 10 Jan 2020 14:30:20 +0000

As development teams work to ship code faster and faster, streamlining development workflows is crucial. Developers don’t just write code all day. There are other tasks developers spend time on that may be slowing them down and preventing them from doing the work that adds value to a company. 

According to ActiveState’s 2019 Open Source Runtime Pains Developer Survey, 36.8% of developers spend two to four hours per day coding, and only 10.56% spend all of their day coding. Non-coding time is typically spent on tasks such as software design or attending meetings.

As the need for being more productive has grown, so has the availability of solutions to help developers be more productive and collaborate more easily.

These tools come in all shapes and sizes. There are tools that are designed specifically with productivity and collaboration in mind, such as Slack, Microsoft Teams, and Trello. But it is also common to find development tools with productivity features baked in, such as IntelliJ IDEA, CodeStream, or ZenHub. 

RELATED CONTENT: 
For development teams, it’s time to throw out the open-office plan
Crunch culture can destroy development teams
Three steps to reduce burnout

Developers need to work together on projects often, and having a tool that makes communication easier and more transparent is beneficial. According to Mike Ammerlaan, director of Office & SharePoint ecosystem marketing at Microsoft, email isn’t a great system for this for several reasons, including that it doesn’t provide a way for people to specify how they want to be notified in threads and it isn’t a great format for recording knowledge. 

Another advantage of communication tools like Slack and Microsoft Teams is the ability to create custom channels. “With the custom channels, what you can do is you can further subdivide in the processes within your developer teams,” said Ammerlaan. “So for example, [you can] have a channel for people to come by and report bugs or a channel for having post-mortem conversations after an incident, or dealing with incident response.”

These platforms also offer a number of different integrations targeted at developers so that different aspects of the development life cycle can be tied back into the chosen communication platform. “The idea is that you can bring those applications into Microsoft Teams and connect them into the conversations, connect them into the workflows, and sort of weave them into a tailored space that fits the engineering teams.”

Microsoft is also continuing to innovate the platform and extend the surfaces that developers can customize, Ammerlaan explained. For example, in November at Microsoft Ignite, the company added capabilities such as secure private channels, multiwindow chats and meetings, pinned channels, and task integration with To Do and Planner. 

Apart from communication platforms, project management tools are also heavily used by developers to keep projects on track. Trello is especially popular because it essentially allows for a personal Kanban workflow that is completely customizable based on a person’s working preferences. “Trello has become my most favorite to tackle all my pending tasks in a day while maintaining a normal routine,” said Mehul Rajput, CEO and co-founder at Mindinventory. “I can classify my tasks as ‘To Do’, ‘In Process’ and ‘Done’ categories.” 

Peter Wilfahrt, chief digital officer and co-founder of German e-commerce agency Versandgigant, also praised Trello for its ability to help keep track of project goals. To stay organized, he has divided his board into “Projects,” “Next Actions,” “Waiting For,” and “Someday/Maybe.” He also connected Integromat, which adds items to Trello automatically, further increasing productivity. 

Tools like this can end up being central to a person's, or even an entire team's, workflow. For example, Scott Kurtzeborn, an engineering manager on the Microsoft Azure team, explained that his team has been using ZenHub to track all of its work since the team was formed. “[Without ZenHub], it would just be a mess,” said Kurtzeborn. “I can’t imagine not managing our backlog the way we are … How has it affected our productivity? It’s kind of at the center of it, in terms of how we track our work.” Other project tracking and management software includes Atlassian Jira, Anaxi and Clubhouse, each geared to making developers more productive.

There are also things that one might not immediately think of as a productivity tool, but that boost productivity in other ways. For example, Rajput uses an app called F.lux, which automatically changes the color of his computer screen based on the current time and his location. “As a developer, I need to spend so much time with my eyes on the screen. And for me, working with tired and dry eyes is a real obstacle. So, I use F.lux to help myself remain relaxed … Warm colors help to work longer by making the work environment more pleasant and natural to my eyes.” 

IDEs also fall into this category because of helpful features like autocomplete and syntax highlighting. “Most of that streamlining occurs by virtue of replacing mouse operations with keyed commands that are faster to execute,” said Jeff Langr, owner of training and consulting company Langr Software Solutions.

No-code and low-code tools also help with productivity. These tools typically feature drag-and-drop components, which allow developers or business users to easily create simple applications, often without having to write a single line of code. 

Low-code and no-code are often marketed to business users, or “citizen developers,” allowing non-developers to create business applications without having to know how to write code. But these solutions can also be strategically used by developers to cut down on production time. 

In fact, low-code provider Altova offers several developer-focused features in its low-code platform, MobileTogether. Its “program once, run everywhere” environment provides developers with an abstraction layer between them and the native SDKs and APIs. This helps developers cut down on time and effort by reducing the amount of code they need to write, Altova’s CEO Alexander Falk explained to SD Times.

According to Langr, productivity tools can help developers spend more time on the things that matter. “It may seem silly to worry about such small amounts of time, but they really do add up.”

Rajput recalls many days where he and his team were in the right frame of mind to do their best work, but small hindrances got in the way. 

In addition to the added productivity, these tools may have other benefits, such as documentation that might not otherwise have existed. For example, Wilfahrt keeps track of his work in Trello, and by forwarding every request into a single source of truth, he can document change requests and keep track of improvements and delays. “A long-term record will allow you to re-evaluate predictions (how long will it take to implement X?).”

What do developers look for from a productivity solution?
When searching for new tools, Langr finds that tools that help with small tasks are more helpful than ones trying to help with big-picture challenges. He also seeks out tools that don’t force him into a certain workflow. 

Views like this explain Slack’s popularity among developers. In the Slack App Directory, you can find a very specific app for accomplishing a specific task, but that app will still be grouped into a single platform. According to Bear Douglas, director of developer relations at Slack, the platform just surpassed 2,000 apps in its app directory, and over 500,000 custom integrations are actively used on a weekly basis. 

“When you think about what it’s like in your everyday workday, I would guess that you probably use upwards of a dozen just between communication with people, writing things, your code editor, etc,” said Douglas. “It adds up quickly and I think that one of the things that has made Slack so powerful is that we’ve been the communication hub and the nexus for all of these different services.”

Wilfahrt has a similar view when looking for tools. “I love productivity tools that are easy to use and don’t require to change your current way of working … The interface and usage has to be simple but powerful,” he said.

In addition, because Wilfahrt’s goal with these types of tools is to clear his mind and limit distractions, the ability to keep track of meeting outcomes and add new data points, such as requests, improvements, critical issues and reprioritization, is a must for him.

Wilfahrt also recommends developers find one system and stick to it. “We software developers are happy to explore the newest hype and the newest promise,” he said. But changing your productivity system every month is brutal, he explained. “Implement your vision in one system, adjust it to your needs and stick to it until you find one that can do everything that your current system can plus your new requirements. Only if you can check off all those boxes are you allowed to initiate a change.”

Perhaps more important than finding the right tools is mastering them, Langr explained. “Master the wonderful tools that Unix distributions provide, but particularly master your editor, whether it be emacs, vim, or bare-bones editing in IDEA,” Langr said. “I’m always amazed at how inefficient too many veteran developers are when it comes to editing their code …  Master your tools. It’s well worth the investment.”

He added that once he masters a tool, the tool itself moves out of the way too. “I can think about my real goals instead of the rote steps needed to accomplish them,” said Langr. 

By implementing and then mastering these tools, developers can significantly reduce the amount of unnecessary work that interferes in their day-to-day work, and focus instead on creating value for the business. 

No-code mobile app development: Do more with less https://sdtimes.com/mobile/no-code-mobile-app-development-do-more-with-less/ Wed, 04 Dec 2019 16:00:07 +0000

Developers are tired of switching their focus back and forth between projects, and business folks are tired of waiting for developers to get to their projects — but it doesn’t have to be this way.

The rise of mobile development is enabling more work to get done on the fly, and the explosion of no-code development tools in the marketplace is enabling business users to create their own mobile solutions without relying on developers.

RELATED CONTENT: How no-code disrupts traditional mobile code-based app development

“There are many businesspeople who face specific problems, and although they may have the ideas to overcome these hurdles, they are dependent on IT teams to turn them into practical solutions. If you train your employees to solve their own IT-related problems, they can become more versatile, independent, and overall way more successful at their job,” said Chris Obdam, CEO of the no-code enterprise app development tool provider Betty Blocks.

The need for no-code development has come out of the demand for digital capabilities, said Jeffrey Hammond, vice president and principal analyst for Forrester. There are just not enough computer science professionals available, he explained.

In a recent Forrester mobile executive survey, when asked how many technical resources they had dedicated to mobile, most organizations reported an average of only around 20 technical people. “If you are trying to build native applications with that kind of staff, you are lucky if you are going to get more than five applications out. If you are a large organization, you simply don’t have the amount of technical development resources available to build and maintain these applications. Getting the business users involved and helping them meet their needs, even if the app might be a little simple or not have quite the fidelity of a full blown native app, it still provides a lot of value to organizations,” said Hammond.

According to Obdam, there are various definitions of no-code, so it is important to note what no-code actually means. “No-code generally implies that you can build an application without the need for traditional programming. But the type of applications that no-code can build are very diverse, depending on the provider,” he said.

No-code isn’t just for business users
Typically no-code solutions are targeted at business users while low-code solutions are targeted at developers. No-code enables business users to create apps with no programming experience while low-code gives developers the option to manipulate the code.

“For the end-user there is no difference. It’s the process of building the application that varies. If your IT team doesn’t have a problem with working with new tooling that hardly requires any coding from them, then it’s a great solution. However, if your developers really enjoy traditional programming it might be more sensible to opt for a low-code solution,” Obdam said.

While there is a lot you can do within no-code solutions, sometimes there is still a need to drop into and manipulate the code, said Forrester's Hammond, so experienced developers tend to stay away from no-code solutions.

According to Obdam, native mobile development still “requires real craftsmanship,” and seasoned developers prefer to take advantage of underlying technology and techniques. “There are many traditionally schooled developers who enjoy the fact that they don’t have to code as much as they used to, whereas others might find it harder to distance themselves from coding. For them, a full-on switch to no-code might be a bit much.”

Obdam added that no-code is a good option for novice developers who want to get their hands dirty with mobile development. “Although the chances of adopting a no-code solution are smaller among seasoned mobile developers, no-code does enable less experienced developers to start tinkering with mobile application development. Making a mobile app used to be reserved to a select few, but with no-code tooling, you see that many more people become very capable of building their own mobile applications.”

The rise of no-code in mobile
A major trend happening in the mobile space today is that organizations are actually starting to look deeper into why they are building mobile applications as opposed to just building them and hoping things will turn out right, Forrester’s Hammond explained.

“In the early days, companies said they had to have a mobile app because the competitors had a mobile app…almost like that State Farm commercial where the agent says ‘well I have a mobile app, too’ even though he doesn’t,” he explained.

What is beginning to happen is organizations are concentrating on their mobile spending, and since organizations usually only have enough in the budget to do a couple of native mobile apps a year, they are starting to seriously consider no-code. “All the things that employees need or use often go unfulfilled or they are not maintained even if they are built because there are just not enough resources to do them, so it creates a need for no-code,” said Hammond.

In addition, almost everyone has a mobile device today. It is an intimate form of enabling communications and that is something businesses really want to take advantage of but haven’t been able to, Hammond explained.

Because of this problem, business’ software needs are not being met. Mobile apps are becoming time consuming and too expensive to build. “There’s a whole lot of smart people working in every company, in every line of business within the company, who see software as an answer to a wide variety of their problems — whether it’s about too many manual processes, or lack of appropriate data collection, analytics and insight from the data, or the lack of intelligence in a lot of the processes. So all of those opportunities exist, and the demand for software is growing. Yet, there’s no fundamental change. Software is actually becoming tougher and more expensive to build,” said Praveen Seshadri, founder and CEO of no-code platform AppSheet.

One option is to buy something off the shelf, but it rarely ever works out, Seshadri explained, because every business has unique and specific needs. “It’s this confluence of forces where everybody’s got a device, everybody can visualize quite easily the applications they might build on the factory floor, in a warehouse, out on the farm, or whatever it is. And yet it’s increasingly incredibly expensive, time consuming, to build these applications, so they’re not getting built. So that’s this pressure situation, and that’s why you’re seeing a lot of talk about low-code and no-code,” Seshadri said.

The applications being built with no-code are typically not going to be used by millions of end users; they are primarily built for internal productivity, Betty Blocks’ Obdam explained. “No-code is especially useful for mobile apps that focus on small processes for specific types of employees. A dock worker for instance, spends most of his working hours outside carrying only a mobile phone or tablet. It becomes interesting when this person can oversee all the processes relevant to him while on the go,” he said.

Seshadri explained that no-code can actually speed up mobile development processes and free up IT workers, who were forced to build these business apps because they were the only ones who could create the software. By empowering a business user to create an application you actually empower the people who understand the problem to turn their ideas into reality, Seshadri said.

“It becomes faster to build because there is no longer any back and forth between the business and IT department on what needs to be built and why,” said Seshadri. “Why would they ever want the developer? Even if a developer was available, it’s still way faster, and you’re not translating this how to somebody else’s going to do it, and then give it back to you, and go back through this process again when you could do it yourself. So, no business user would want to do it if they can build the apps themselves. Or somebody in their organization can build the apps themselves. So it’s actually somebody who understands the part of the business, the process they are actually in.”

Obdam added, “Long story short, no-code allows you to create specialized mobile apps that are part of a larger ecosystem. That’s where a no-code platform can truly excel: You build a platform supporting multiple mobile apps that are linked to a central back-office. Every part of the ecosystem can be created with no-code. In that way, no-code facilitates the entire process, which also makes it easy to oversee and govern each component.”

The no-code mobile app process
No-code mobile apps are often built by business users for business users, meaning they don't go beyond the organization's employees. That doesn't mean, however, that they shouldn't undergo the same scrutiny that a consumer-facing mobile app does.

According to Forrester’s Hammond, there has to be some level of governance put in place as organizations begin to expand their use of no-code. There needs to be someone in charge to offer advice and help during the process, and there needs to be rules and standards about what apps should be created and how they should be created. In addition, the IT department should not be left out of the loop, especially if you expect them to maintain the applications.

“IT ends up playing catch up because they weren’t aware that this was going on or are asked to maintain these apps. It is better to be proactive about that governance process because in general you can put an app out without doing any traditional testing,” said Hammond.

Business users also need to be aware of compliance, security, and risk management policies. “From a security perspective, if you have people punch holes in your existing security posture to get access to data and you are not on top of that, then you increase your threat surface without even realizing it,” said Hammond.

AppSheet’s Seshadri believes all this can be solved with the proper platform. “The fact of the matter is code is incredibly hard. Performance tuning, very hard. Making sure things are secure. Very hard. Honestly, you do not want developers writing that code. Because almost always every 20 lines of code has a bug in it. What you want is a platform to give you security by default. You want a platform to give you a scale and performance by default. Without letting them mess it up. And that’s what no-code platforms do,” he said.

Similarly to how users don’t worry about security or performance when using something like Google Docs or Office 365, business users should not have to worry about implementing security when developing no-code apps. “They should be given an abstraction, and a model that says, here are the security abstractions,” said Seshadri.

Additionally, a platform should enable business users to go in and make changes, updates and add new features once the application is built. “If you enable a business user to build whatever you define as their first version of their application, it’s the equivalent of giving a mouse a cookie. Because their needs don’t stop with just the mobile app, or all the workflow messaging. But their ambitions around the apps they need to build,” Seshadri explained. “There’s a whole end-to-end system around it. And that end-to-end system needs to be built. You need to be able to update it. You need to be able to manage the data, do whatever ETL, archiving, reporting, analytics, auditing, scale it out if they have more people, deploy it in different languages… All of that needs to be available to the business user, but ultra simplified.”

No-code mobile apps in the real world
It is easy to talk about the benefits of no-code development for mobile applications, but how do these applications actually look and behave in the real world?

Tutti Gourmet, a gluten-free and allergen-free cookie and snack manufacturer, recently turned to no-code development to automate several processes. “I consider myself a moderate to advanced Excel user,” said Elijah Magrane, operations director at Tutti Gourmet. “When I came here, everything was done manually by hand—either with paper or by physically entering data into a spreadsheet. So, my first order of business when I started was to overhaul the process.”

Some of the solutions Magrane created with AppSheet’s tool included a timesheet for employees to track hours, production logs and summaries, and production, inventory and documentation applications.

“If we’re not up on our documentation, we could get a recall, which would probably put us out of business,” said Magrane. “Now, I receive notifications when expiration dates are approaching. This way, I can stay on top of all our documentation for our suppliers. This has been really really helpful.”

In an effort to undergo a digital transformation, energy company Kentucky Power recently turned to no-code to help move away from paper and digitize processes, automating everything from inspection and incident reports to employee communications. Among the prerequisites the company was looking for in a mobile development solution: it had to have a built-in scanner to track serial numbers, enable users to create new forms and work orders as well as update existing ones, and support fast development. With no-code, the company was able to create apps that tracked failed or damaged electric poles, transformers and circuits.

“As a pilot, we started with the Transformer Tracker in our Ashland shop with three or four foremen, who have to collect information about transformers,” Paula Bell, a lean team member for Kentucky Power, said in a case study. “After about a week of use, a foreman from another shop called me, and said ‘Hey Paula, can I have that thing that Rick uses to get serial numbers?’ When someone asks to start using a new tool based on another user sharing it, to me, that’s success!”

Lastly, KLB Construction turned to no-code because it was cheaper and more effective to build its own custom solution that met its specific needs than to buy the construction management software solutions already available. Some of the applications the company created included field management, daily reports, reimbursements, near misses and incidents and safety alerts.

“Everything is starting to get connected and even an underserved industry like construction is going to have to adopt new technology to stay with the curve. Larger companies have more resources and seem more willing. KLB is already an early adopter in construction technology and that has made us far more efficient and productive,” Richard Glass, director of information services at KLB, said in a case study.

Don’t do Agile, be Agile https://sdtimes.com/agile/dont-do-agile-be-agile-2/ Tue, 05 Nov 2019 14:30:55 +0000

Despite what you may have heard, Agile is not dead. A couple years ago, Dave Thomas, one of the creators of the Agile manifesto, declared that Agile was dead, but it wasn’t the idea of Agile he was talking about. It was the word Agile itself.  

“The word ‘agile’ has been subverted to the point where it is effectively meaningless, and what passes for an agile community seems to be largely an arena for consultants and vendors to hawk services and products,” he wrote in a post.

The core principles of Agile are still important for helping organizations deliver value more efficiently and effectively, but somewhere along the way the word Agile became so corrupted and devalued that it now causes confusion across the industry, he explained. For instance, there are no Agile programmers or Agile teams; there are programmers and development teams who practice agility, and tools that help them increase that agility.

Organizations are constantly looking for ways to build on their Agile successes and scale them throughout the rest of the business, but if they don’t understand the core principles of Agile in the first place, that can be nearly impossible.

“Agility requires more than just having a few roles, artifacts and events. It involves actually using each of those things for a specific advantage. Most organizations seem to be going through the motions rather than understanding what truly drives effective, sustainable agility,” said Bob Hartman, founder of Agile for All, an Agile and Scrum consulting firm. “Agile is hard to do well because it requires changing the way we think and the way we do things. We do all that for a reason: to build better products or get better results. If we don’t understand how Agile principles relate to the final results then we will be stuck doing the practices and basically having the old way of work done in short increments.”

The problem is that Agile isn’t something you do, it is something you are. It’s a mindset, according to Steve Denning, an Agile thought leader and author of the book The Age of Agile. “It is a shift in mindset from a top-down bureaucratic hierarchical approach to a very different way of thinking about and acting in organizations,” he said. “If you don’t have the Agile mindset you are going to get it wrong.”

Denning explained that there are three components of an Agile mindset:

  1. An obsession with delivering value to customers
  2. Descaling work into small items that can be handled by self-organizing teams
  3. Joining those teams together in a network that is interactive and can flow information horizontally as easily as up and down

Most organizations that say they are Agile usually only possess the second characteristic through Scrum, Kanban, or DevOps, but they miss the obsession with adding value and the network component. Without all three components, it is very likely that the organization will revert to bureaucratic, hierarchical, top-down practices, according to Denning.

“You can’t scale mindsets. You either have them or you don’t, but obviously everyone in the organization has to have the same mindset. If you don’t have people in the organization with that mindset you are going to run into massive conflicts,” said Denning. 

Scaling Agile to the enterprise
Most organizations are still in the early stages of scaling Agile. “A decade ago many enterprises thought Agile was only for small teams doing small projects. Now it is being recognized that larger projects can be done using Agile and the results will be far superior to what was done in the past,” said Agile for All’s Hartman. 

Steve Elliot, head of Jira Align at Atlassian, used the analogy of crawling, sitting, walking, running and flying when talking about the evolution of Agile within a large organization. 

“When you really get to where you are running, flying and innovating more like a startup does is where you are really driving heavy outcomes and you are not letting the bureaucracy and red tape that comes with a large organization get in your way as much as it typically does,” Elliot said. “The reason this is still such a hot topic is we haven’t completely solved it yet, especially at the enterprise level.”

He explained scaling Agile really means getting to a place where functions are repeatable, predictable and measurable at the team level, and then bringing that into the rest of the business.

“The market and technology moves too fast to work in the old way and so [the business] knows they need to learn a new way of working. I think they are all kind of watching other industry leaders and figuring out how fast they need to get there,” said Elliot.

One of the biggest barriers keeping organizations from running or flying is just being able to align the entire organization. 

“This concept of taking Agile principles and applying them across thousands or even hundreds of thousands of people is just not an easy thing to do quickly. It really is just a transformational challenge to get that many people on the same page,” said Elliot.

According to Elliot, the ones who are the best at being Agile are the ones who have executive alignment. The executives across different business units, portfolios and teams are all aligned and trying to solve one problem. They are willing to stick their necks out and work on the problem directly and be owners of the transformation, he explained. 

But getting everyone in the organization on the same page and looking at things the same way can take five to 10 years in medium-to-large organizations, according to Denning, which can deter Agile initiatives.

“Things can start very small, but they can have fantastic success. It just takes time. It is not going to happen overnight, but it does need courage and a deep understanding of the change involved,” he said. 

Both Denning and Elliot agree that executive buy-in is one of the biggest challenges when it comes to a new way of working, but as there are more Agile successes in the industry and executives are pressured to compete in the market, that challenge should go away with time, they said. 

In addition to alignment, Flint Brenton, CEO of the enterprise Agile solution provider CollabNet VersionOne, explained communication and collaboration are key, and can help with the overall alignment. Once the overall leadership alignment is in place, Brenton explained a number of things have to happen: You have to commit as a company that you are going to change the way you build software and understand it takes time; you have to change the culture and empower teams; and you have to select the platform to support Agile. 

“Large enterprises will take some time to scale horizontally across the organization to get teams straightened out and understand how to get predictable in a more Agile fashion as opposed to the traditional planning cycle,” said Elliot. “If we are trying to measure results more ruthlessly and frequently, it ends up being an organization change, culture change and how you think about business in general, even all the way to funding. It just takes time.” 

Don’t let scaled Agile mislead you
Despite all the conversations to scale Agile, Denning explained the word scaling puts businesses on the wrong track. Agile is about descaling things, and finding ways to simplify things into the smallest possible component, then connecting them together in a network, he said. 

According to Hartman, instead of thinking about scaling, organizations should think about breaking projects down into smaller teams, understanding what is truly valuable and what can wait, and limiting the amount of work in progress. “One project at a time is optimal for a lot of reasons. This is a huge learning curve for most organizations. They want a lot of things in process, which slows it all down,” he explained.

“Organizations need to first recognize that scaling is not a requirement for success. Second they need to recognize that however they start with scaling, it needs to be inspected and adapted as they learn,” added Agile for All’s Hartman. “When they have projects that truly need scaling, the next step is understanding the need for small empowered teams that work together toward a common goal. Scaling agility requires agility — at all levels of the organization. It can’t be something where a bunch of teams work together and nothing else changes.”

Additionally, organizations need to recognize and understand how scheduling, planning and leading all change. “Tradeoffs need to be made and they need to be made with agility in mind. When we focus on being more Agile, we tend to get results. It feels more risky, but in the end it is actually less risky than all the assumptions we put in place that make us feel better about most projects,” said Hartman. “There have been many studies done on Agile and the results achieved. When small, empowered teams work together toward a common goal the results are amazing. When small teams are simply told what to do and there is no flexibility in what is created then the results don’t tend to be much better than prior to the agile implementation. In fact, things can sometimes get worse as people simply think they are in more meetings that serve no purpose.”

Atlassian’s Elliot predicts in five years, Agile will be the safe choice among enterprises. “It is proven to be more effective. Hanging back with the old method is actually going to be the way where you are at more risk because you look like a dinosaur and you are not getting results.”

It’s also important to be careful of the tools you implement in a large Agile initiative. A lot of times, organizations turn to a scaling Agile framework like SAFe, LeSS or DaD to help; however, Denning believes that, just like the term scaling, these frameworks lead people in the wrong direction.

SAFe, for example, “is all about an internal system running in a top-down fashion and trying to create a hierarchy in a bureaucracy within which Agile teams can function, but once you have them locked in these compartments, you are not talking about Agile at all,” he explained. 

Hartman is also wary about recommending a framework because too often organizations look at them as a silver bullet. Instead, Hartman explained, a good starting point is to understand issues and choose strategies, tactics and practices based on needs.

Elliot agreed that frameworks can be tricky because people tend to live and die by the framework and start to forget about why they are doing it in the first place. But he also explained that without a framework it is hard to have a common language across the organization and understand what is or isn’t working. “Organizations don’t have to use an off-the-shelf framework. It can be a framework that the organization designs. But you need some level of structure and process to Agile. I don’t think you can do it without [a framework]. There has to be some way above the teams to look across different products and look at what is happening with customers in a uniform way,” he said.

However, CollabNet’s Brenton believes a framework like SAFe can help provide a good pathway to Agile success. Organizations just need to have a conversation about how stringent and strict they want to be around the SAFe principles. For instance, organizations will take the SAFe principles and decide to only focus on a couple in order to get a baseline, and then continue to work at taking Agile and SAFe to the next level. “You have to do a self-assessment and determine what is your capability to make those changes and basically ramp the change,” said Brenton. “You have to look at your own company and determine what is your ability to execute and then you adjust your expectations and your deployment model accordingly and that is the best way to ensure success.”

Moving to the cloud https://sdtimes.com/cloud/moving-to-the-cloud/ Tue, 06 Aug 2019 13:43:43 +0000

The winds are shifting in the industry, and enterprises are clear for takeoff to the cloud.

It’s no longer a question of should you move to the cloud. Nowadays, you need a good reason not to be in the cloud, according to Ken Corless, a principal with Deloitte Consulting in its cloud practice.

In a recent IDC worldwide public cloud services spending guide, the research firm predicts public cloud spending will grow from $229 billion in 2019 to about $500 billion in 2023. RightScale also found in its 2019 State of the Cloud survey that enterprises are prioritizing public cloud, with it being a top priority for 31 percent of respondents, and companies planning to spend 24 percent more on public cloud this year compared to last year.

RELATED CONTENT:
A Cloud Smart Strategy
A sobering look at cloud
Understanding the meaning of cloud-native applications and development

“Adoption of public cloud services continues to grow rapidly as enterprises, especially in professional services, telecommunications and retail, continue to shift from traditional application software to software-as-a-service and from traditional infrastructure to infrastructure-as-a-service to empower customer experience and operational-led digital transformation initiatives,” said Eileen Smith, program director of customer insights and analysis at the IDC.

The RightScale survey also found that 94 percent of respondents are using the cloud, 91 percent are in the public cloud, and private cloud adoption is at 72 percent. Additionally, 63 percent of adopters are at intermediate to advanced stages in their cloud journeys. Sixteen percent are beginning their first projects and another 13 percent are planning their first projects. Only 8 percent of respondents revealed they had no plans to move to the cloud. 

“Organizations see the value and understand the value of the cloud,” said Abby Kearns, executive director of the Cloud Foundry Foundation (CFF). “They are no longer moving to the cloud to check a box because that is what the boss wants. It is more of thinking about the cloud more holistically and around the larger transformation journey.”

The new driving factors behind the cloud
As cloud technology has matured and the benefits have been realized, the reasons for moving to the cloud and how to get to the cloud have changed.

According to Corless, a couple of years ago cost would have been the number one driver to moving to the cloud, but today agility and responsiveness are some of the top drivers. For instance, end users expect companies to respond and deliver faster, and the cloud helps facilitate that agility. 

“What customers find is that they can just get things done more quickly and be more agile because they no longer have to worry about their own infrastructure anymore. Instead, they can focus on getting the applications, getting the analytics and other functionality they want up and running very quickly in cloud environments,” said Dominic Sartorio, SVP of products and development for Protegrity, an enterprise and cloud data security software provider.

Cost, however, will still be a big factor for enterprises moving to the cloud because it has the potential to dramatically reduce overall cost, according to Tej Redkar, chief product officer at LogicMonitor, a SaaS-based performance monitoring platform. “If you look at how much savings a cloud can add especially when you look at the infrastructure, hardware, and software, you need a lot of resources to maintain and manage all those things, and all that is being taken care of when you are in the cloud. Ultimately, it leads to cost savings whether that’s labor costs, licensing costs, or general performance, which translate back into costs,” he said.

Whether it is cost, agility or something else, one thing that is certain is moving to the cloud isn’t really about the cloud at all today, according to Deloitte’s Corless. Trends like DevOps, Agile, microservices, machine learning and automation are becoming imperative to the way an organization works and develops, and they are using cloud as a catalyst to break traditional workflows and transform the company. 

“The cloud truly makes you a digital company,” added Redkar. “It helps you transform digitally pretty quickly and helps with the global reach of your solutions.”

Starting the journey
Understanding what is driving your organization to move to the cloud is really going to spark your planning for takeoff, according to CFF’s Kearns. “Is it improving your ROI, improving velocity, decreasing costs, or deploying app workloads in a variety of different regions? Think about what you are trying to gain and then be clear with your teams about what those expectations are,” she explained.  

According to LogicMonitor’s Redkar, there are three adoption phases to moving to the cloud: First, any new applications you are building should be built in the cloud. Then, you should move bespoke apps into the cloud and see how they run compared to those on-premises. The third phase is transforming those bespoke applications from an infrastructure service to more of a platform-as-a-service. “For complex enterprises, it is going to take time because there are thousands of apps within the enterprise, and to cover all of those, their dependencies, their data and their dependencies on the network is going to take time,” Redkar said. 

Because of the number of services and solutions enterprises will have to move to the cloud, even the most aggressive adopters of the public cloud will continue to have a hybrid cloud strategy for years until they can completely move their footprint out to the cloud, according to Deloitte’s Corless. 

Rackware’s co-founder and chief architect Todd Matters suggests organizations start with a pilot. “Almost everyone ends up overspending in the cloud, so picking a relatively small pilot could help understand how to use the cloud and how billing works.” 

Corless also added that companies should start with things that are less mission-critical because “migration is going to be bumpy and you would rather learn and make mistakes on things that aren’t going to be that detrimental to the business.”

According to RightScale’s cloud survey, cloud cost optimization is the number one priority enterprises have this year. As cloud adoption begins to grow, enterprises are struggling with how to manage their cloud spending and optimize cloud strategies to reduce costs wherever possible, the report finds. The report revealed cloud users underestimated the amount of wasted cloud spend, with respondents citing 27 percent of waste in 2019. Despite an increased interest in cloud cost management, only a few companies actually implement automated policies to address issues. 

Getting serious early on about deploying software like auto-scaling will help manage costs because it allows cloud adopters to dynamically increase and decrease workloads as needed, Matters explained. Monitoring software also can reduce costs by detecting underutilized servers or over-provisioned servers that have too much CPU, memory or storage.
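
Monitoring-driven rightsizing boils down to simple arithmetic over utilization data. The sketch below, in Python, is a minimal illustration of that idea; the server names, metric values, and thresholds are all hypothetical, and a real implementation would pull the numbers from a monitoring API rather than a hard-coded table.

```python
# Minimal sketch: flag servers that look underutilized and could be downsized
# or scaled in. All names, numbers, and thresholds here are hypothetical.

# Average utilization over the last 30 days, as fractions of capacity.
servers = {
    "web-01":   {"cpu": 0.09, "memory": 0.22},
    "web-02":   {"cpu": 0.11, "memory": 0.30},
    "batch-01": {"cpu": 0.71, "memory": 0.64},
}

CPU_FLOOR = 0.15     # below this average CPU, an instance is a downsizing candidate
MEMORY_FLOOR = 0.35  # same idea for memory

def underutilized(metrics: dict) -> bool:
    """Flag a server only when *both* CPU and memory are low, since either
    resource alone can justify the current instance size."""
    return metrics["cpu"] < CPU_FLOOR and metrics["memory"] < MEMORY_FLOOR

for name, metrics in servers.items():
    if underutilized(metrics):
        print(f"{name}: avg CPU {metrics['cpu']:.0%}, "
              f"memory {metrics['memory']:.0%} -- consider a smaller instance")
```

The same comparison, run continuously and fed into an auto-scaling policy instead of a print statement, is essentially what the monitoring tools Matters describes automate.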

A multi-cloud, multi-platform approach
Most enterprises are also taking a multi-cloud strategy when it comes to adoption. According to RightScale’s cloud report, 84 percent of enterprises are taking a multi-cloud approach. 

“You don’t want to put all your sensitive information and applications into the hands of one cloud company,” said Rackware’s CEO and co-founder Sash Sunkara. “It is important to be able to easily move from one cloud to another because the costs of one cloud can significantly change or the reliability can significantly change. We have heard of outages at Amazon, outages at Google. You want to make sure that you can easily move so that no matter what cloud goes away or what datacenter goes down, you are going to be fine and those applications and sensitive data are going to be protected,” she said.

Multi-cloud is going to continue to be an important part of a cloud adoption plan because apps and services and business goals change over time, so cloud adopters need to be able to adapt and adopt a strategy that allows them to move to different clouds based on their current needs, Kearns added. 

In addition to multi-cloud, enterprises are also adopting a multi-platform strategy. According to a recent report from the Cloud Foundry Foundation, 48 percent of respondents are embracing multi-platform with a combination of PaaS, containers and serverless. The report also finds containers are expected to become mainstream within the next year, with 34 percent of respondents using 100 or more containers. According to the CFF’s Kearns, container usage is rising because containers actually help enterprises along their cloud journeys. “Think of containers as the building blocks to organizations that want to take advantage of these cloud architectures. Containers are still at the very heart of that. They are what allows portability and a constant environment,” Kearns said.

Deloitte’s Corless explained that containers also help you balance some of the fear against vendor lock-in due to the ease of moving containers and migrating their workloads.

The road can be bumpy
Even though there is a lot of interest in the cloud and a lot of enterprises claiming to be executing a cloud strategy, there are still a number of things that can go wrong. 

One of the first steps to moving to the cloud is moving business applications such as third-party apps enterprises traditionally have hosted themselves, like Office 365, SharePoint, Slack, Box and Microsoft Exchange, according to Deloitte’s Corless. So while it may seem like most enterprises have moved to the cloud, they may have only moved to these types of SaaS solutions, which tend to be less disruptive and invasive cloud adoption strategies, Corless explained. 

This is just a stepping stone to a broader cloud adoption, according to Corless. 

The hard part is when you take that to the next level and start moving bespoke applications or applications enterprises have built themselves, such as consumer-facing apps, to the cloud, according to LogicMonitor’s Redkar. “The complexity of those existing workloads becomes a challenge because you can’t just lift and shift an application to the cloud,” he explained. “You need to tweak it for the cloud, and that tweaking takes time and comes with its own complexities.”

Enterprises have to evaluate which workloads can be moved to the cloud, and which ones should remain on-premises. They also need to consider the data itself. According to Redkar, if enterprises are hosting their own data and have been for years, whether in a mainframe or data warehouse, that is going to be extremely difficult to move to the cloud. That is why it is important to hire the right people with the right skills to help facilitate cloud adoption.

“Once you have the right people with the right skill sets in place, then they will make the right decisions on how to run and move apps to the cloud,” said Redkar. 

However, that can be a challenge in itself because a culture change also needs to take place to successfully move to the cloud, according to CFF’s Kearns. “The technology part is the easy part. Changing your culture, changing your business, changing the way your business works and communicates is the larger challenge. It isn’t just a matter of putting some apps in the cloud, it is really a matter of building that velocity in your organization,” she said. 

Developers, operators, line of business, product owners and security experts all need to be put into a single team with a single goal in mind around not just the application development, but iteration and ideation, according to Kearns. “The hardest thing to do is build those cross-functional teams and getting that collaboration when you need it,” she said. 

A cloud culture is one that thrives on change and innovation, according to David Linthicum, managing director and chief cloud strategy officer for Deloitte Consulting. And then in order to make sure you have the right skills in place, enterprises should do a skills gap assessment, decide what technology they are going to be using and what skills they have on staff. This will help assess whether an organization needs to hire or train new employees, Linthicum explained. 

While cost is a driving factor for companies to move to the cloud, it’s not always cheaper than running on-premises, according to Kearns. Having the right skills within the organization can help evaluate which workloads go where and how to continuously evaluate those efforts, she explained. 

Ensuring a smooth transition 
One major hurdle that cloud adopters continue to deal with is security. There is an ongoing debate in the industry on whether it is more secure to stay on-premises than move to the cloud. 

Deloitte’s Corless argued it is a risk either way. “I believe good world-class security is easier in the public cloud than it is on-premises. I also believe big gaping security issues are also easier on the public cloud.”

According to LogicMonitor’s Redkar, the cloud itself isn’t less secure; the biggest security issue is that the people managing cloud environments are not aware of their exposure in the cloud.

Symantec’s 2019 Cloud Security Threat Report revealed that while organizations fear moving to the cloud could result in a larger risk of data breaches, it is actually immature data practices that create the risk. “Data breaches can have a clear impact on enterprises’ bottom line, and security teams are desperate to prevent them. However, our 2019 CSTR shows it’s not the underlying cloud technology that has exacerbated the data breach problem – it’s the immature security practices, overtaxed IT staff and risky end-user behavior surrounding cloud adoption,” said Nico Popp, senior vice president, cloud and information protection for Symantec.

While tools can help organizations design and implement policies as well as apply protection methods like encryption algorithms, tokenization, and anonymization, security is actually more about people and processes than about technology, according to LogicMonitor’s Redkar. “Because most of the data used to reside in the enterprise datacenter, there were not many ways to get in. With the cloud, there are different ways to get in, different credentials which if they leak can lead to different levels of security hacks, so making sure developers and engineers understand and embrace the secure lifecycle of the infrastructure is important,” he said. 
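
To make one of those protection methods concrete, here is a minimal Python sketch of keyed pseudonymization, a simple cousin of tokenization: the same input always maps to the same opaque token, so downstream joins and counts still work, but the raw value is never exposed. The field names and key handling are hypothetical; a production system would fetch the key from a managed secret store and would likely rely on a vetted tokenization service.

```python
import hmac
import hashlib

# Hypothetical key; in practice this would come from a secret manager,
# never from source code.
PSEUDONYM_KEY = b"replace-with-a-managed-secret"

def pseudonymize(value: str) -> str:
    """Derive a stable, opaque token from a sensitive value via HMAC-SHA256.
    Identical inputs yield identical tokens, so analytics still work
    downstream without ever seeing the raw value."""
    return hmac.new(PSEUDONYM_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "plan": "enterprise"}
safe_record = {**record, "email": pseudonymize(record["email"])}
print(safe_record)  # the email field is now an opaque 64-character token
```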

According to Protegrity’s Sartorio, cloud vendors publish a shared responsibility model, which explains what layers of the stack the cloud vendor is going to cover, such as the physical protection of their infrastructure, hardware or network. “What they don’t cover, and they are very open about this, is their customers’ own application workload and the customers’ own data. They say you as the customer are responsible for making sure whatever data and whatever applications you are standing up in these cloud environments are protected.”

The reason cloud vendors don’t protect customer data is because they don’t have any way of knowing what data is coming in and out of their environments — or even whether it is customer data. Further, they don’t know how sensitive it is or what regulatory region it falls under, Sartorio explained.

In order to properly protect data, organizations need to have a set of policies in place, be able to control who sees the data, and take stock of the data that is already there. In addition, implementing a discovery tool can detect if there is any sensitive customer data out in the open that shouldn’t be. 

“The biggest problem is ignorance. You hear about all these embarrassing breaches all the time where someone put something in Amazon S3 that is just out there for people to see because they were ignorant about cloud security,” said Sartorio. 

If you are not confident about the processes you have in place or your organization’s ability to control data in the cloud, you should stay on-premises for now and figure out how to make it work, according to LogicMonitor’s Redkar. “Eventually, you are going to need the cloud to help you expand your business more rapidly than your datacenter can, unless there is a regulatory need to stay on-premises,” he said. “If you want to compete in the marketplace then you need to adopt the cloud.”

Reskilling developers for the new software landscape https://sdtimes.com/softwaredev/reskilling-developers-for-the-new-software-landscape/ Tue, 02 Jul 2019 17:03:40 +0000

Software changes fast, and developers will need to vigilantly reskill to maintain competence in a highly competitive arena. Reskilling includes learning new programming languages, containerization, big data and working with the most significant tech disruptor: automation.

“There’s a growing awareness that the half-life of any technology skill is about two to three years. Even if you are a skilled developer, [if] you don’t reskill and learn more, it’s hard to stay relevant,” said Leah Belsky, vice president of enterprise at Coursera, a massive open online course (MOOC) provider.

RELATED CONTENT: Bootcamps and MOOCs are picking up STEAM

Luckily, there are many ways to do so. Bootcamps, college courses, conference training, in-house consulting and courses are just some ways to go about reskilling the workforce.

Some of these educational methods can be tailored specifically to companies and their goals, reskilling their workforce accordingly.

“Through a massive consumer database we have insight into what skills are trending and what competitors of companies and specific industries are investing in and how their employees are learning and performing,” Belsky said. “We’re now working on options for companies to actually offer their own content.”

The speed of reskilling is one of the top priorities for companies, which will need to look toward implementing training programs that can more quickly retool the labor force rather than toward multi-year degrees, according to the most recent McKinsey Global Institute report on the state of jobs: “Jobs lost, jobs gained: Workforce transitions in a time of automation.”

Software development is still the most in-demand position for tech giants like Amazon, Google, Apple, Microsoft and Oracle, followed by the demand for data analysts, and the demand for these skills is only projected to grow in the near and far future, according to a report by personal career advisor Paysa. Basic knowledge in mobile development is also beneficial as companies are becoming more mobile-focused and need developers to build apps and mobile operating systems.

Though the automation takeover is not nearly as prevalent in software development as in other fields, developers still need to sharpen their skill sets to remain competitive, since software changes fast.

“We’re seeing that automation is now a must-have job skill,” said Ken Goetz, Red Hat’s VP of Global Education Services. “We see what’s happening in the larger industry, where improving efficiency to enable the move up to the hybrid cloud is pushing customers towards needing more automation in their infrastructure.”

Ironically, while automation and AI are replacing specific tasks that were always manually driven, they also drive a tremendous demand for employees who have the skills to work with that automation.

“Artificial intelligence is meant to complement and enhance rather than displace human labor,” said Costas Spanos, director of the Center for Information Technology Research in the Interest of Society (CITRIS) and the Banatao Institute, which creates information technology solutions for society’s most pressing tech challenges across the UC campuses.

Some businesses are taking the lead, providing on-the-job training and opportunities for workers to upgrade their skills, both through in-house training and partnerships with education providers.

One software company that’s reskilling its administrators in-house is Red Hat. Finding that 20 percent of Linux job postings require automation as a core skill was part of the impetus for creating the in-house training program, according to Goetz.

The company announced that it was overhauling its Red Hat Certified Engineer (RHCE) certification to train Linux professionals. The new program is built around acquiring automation skills, mainly using Red Hat Ansible Automation, because it has become such a vital tool for Linux system administrators.

The new course will teach admins how to automate tasks such as provisioning, configuration, application deployment and orchestration in addition to teaching the core Linux skills. They can then install and configure Ansible as well as learn how to prepare managed hosts for automation.

The size, volumes and the efficiencies needed to run a modern data center or to run a modern cloud — whether it’s a public or private cloud — have grown to levels that only automated software can handle.

In the past, developers would have to write shell scripts to program that automation to do a specific set of tasks. This laborious task was a huge problem standing in the way of productivity.

“The problem with that is that you’re basically reinventing the wheel,” said Red Hat’s Goetz. “You’re not taking advantage of where the industry is and the work that others have done, and so every time someone goes in and builds the script, they’re building it on their own, and they’re maintaining it on their own. It’s just not scalable.”

To solve that burdensome loop, DevOps teams are increasingly adding automation tools like Ansible to end repetitive tasks, speed productivity and scale all in an easily readable language. Puppet and Chef are similar automation solutions for managing infrastructure and applications.
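
To illustrate the shift Goetz describes, the sketch below embeds a small Ansible-style playbook as a string and hands it to the stock ansible-playbook command from Python. Rather than scripting the individual shell steps, the playbook declares the desired end state and Ansible converges the hosts to it. The host group, package, and file names are hypothetical, and the example assumes Ansible is installed locally with a matching inventory file.

```python
import pathlib
import subprocess

# A declarative task list: it states the desired end state ("nginx installed
# and running"), not the shell commands needed to get there. The host group
# and inventory file are hypothetical.
PLAYBOOK = """\
- hosts: webservers
  become: true
  tasks:
    - name: Ensure nginx is installed
      package:
        name: nginx
        state: present
    - name: Ensure nginx is running and enabled at boot
      service:
        name: nginx
        state: started
        enabled: true
"""

pathlib.Path("site.yml").write_text(PLAYBOOK)

# Ansible applies the playbook idempotently: rerunning it is a no-op once
# the hosts already match the declared state, unlike a hand-rolled script.
subprocess.run(["ansible-playbook", "-i", "inventory.ini", "site.yml"], check=True)
```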

“Every business right now is trying to think about how they can automate more of what they do and so having a baseline of skill around automation enables these other things to occur in the infrastructure to be able to support that next generation of technologies,” said Goetz.

In addition to automation, the growth of mobile and smart devices and the spread of the IoT have made connected devices commonplace, creating vast datasets that need to be stored and secured and heightening the demand for data science skills such as machine learning and statistical programming, according to the Coursera Global Skills Index 2019.

“People who have the ability to use the latest tools to analyze big data sets are in demand now,” said CITRIS’ Spanos. “Just like using Word, PowerPoint and Excel became a generic skill that everybody needs to know regardless of what they do, data analytics is going to be similarly pervasive. Data science at UC Berkeley is emerging as a huge new movement that is transforming the way we train everybody and not just engineers.”

Currently, there is a significant deficit of people with the skills to work with the vast troves of data, relegating the untapped information to a category called “dark data.”

A recent report by Splunk, a provider of software for searching, monitoring and analyzing machine-generated big data via a web-style interface, found that over 60 percent of organizations believe that more than half of their data is dark, while a third of them believe that over 75 percent of their data is dark.

“I think a lot of the technological problems of data processing are increasingly solved by commercial tools and open source, so it’s not really the processing of the data so much as the understanding of what the data represents and what the biggest barriers to accessing it are,” said Tim Tully, the CTO of Splunk. “I think a lot of it is just that people are collecting data from devices and laptops and servers in the enterprise and are not doing anything with it.”

The top challenges to recovering dark data include the ever-growing data volume and a lack of necessary skill sets and resources, the report found. Ninety-two percent of employees said that they are willing to learn new data skills. Surprisingly, 69 percent responded that they want to continue doing what they’re already doing, regardless of the impact on the business or their career.

“Data wrangling is hard. Disks are getting larger and larger in terms of capacity. What that’s doing is sort of exacerbating the problem with dark data because now we’re getting more granular with the data we collect. Most people consume the data in a dashboard but they don’t actually know where the data came from,” Tully said. “I think the important thing is to make it a priority to understand that this concept of dark data exists and there’s potential data that you can do something with. The other thing is making sense of the data that was collected.”

In addition to automation and dark data, the importance of new programming languages is a cause for reskilling. As the programming industry evolves, programming languages do too.

According to writer Arani Chatterjee in an article on simpleprogrammer.com called “4 Reasons Why Your Choice of Programming Language Doesn’t Matter Much,” some of the popular programming languages like Perl and Ruby are falling out of favor and new programming languages like Swift, Kotlin and R are gaining popularity.

Developers are always creating new programming languages (e.g., R), frameworks and tools to make development better and faster.

A recent ActiveState survey titled “2019 Developer Survey: Open Source Runtimes” also found that open-source languages get varying degrees of satisfaction from developers, with Python being most satisfying and SQL being the most popular for daily use. Eighty percent of respondents use SQL the most, followed by JavaScript at 77 percent and Python at 72 percent.

Tech workers in high demand

The rapid evolution of software has created a massive demand for technical workers to work as developers and IT professionals in companies that span all industries.

A 2017 Singlesprout report projected a shortage of 1.1 million STEM workers by 2024. Traditionally, secondary schools and colleges carried the onus of preparing workers for the jobs, but they have been unable to do so quickly enough in today’s rapidly changing tech economy.

According to Coursera’s 2019 Global Skills Index, the United States ranks 23rd in the world when it comes to technology. It lags far behind countries that are leading the category, such as Finland, Switzerland and Austria, which have heavy institutional investment in education via workforce development and public education initiatives, the report explains.

“I think [government] policy has to play a major role in that as well because the free market is not going to get everything right, especially when it comes to mid to late career workers that need to be reskilled. There is a danger that they may be left out,” suggested CITRIS’ Spanos. “So I think that there should be a policy that should first create the appropriate incentives on the appropriate skills.”

Meanwhile, the McKinsey report projected that as many as 375 million workers (around 14 percent of the global workforce) might need to switch occupational categories as digitization, automation, and advances in artificial intelligence disrupt the world of work.

A recent research report by the Society for Human Resource Management states that nearly 40 percent of hiring managers cited a lack of technical skills among the reasons why they can’t fill job openings.

One of the largest companies to recently pull out all the stops for reskilling and arming its employees with the technical skills they needed is AT&T.

Early last year, the mobile giant discovered that half of its 250,000 employees lacked the necessary STEM skills to keep the company in the ring against T-Mobile, Verizon and Sprint. Meanwhile, the switch from hardware to cloud-first operations at the company would render the work of 100,000 employees obsolete.

As a response, AT&T invested a billion dollars in retraining its employees. Projections showed that hiring new software and engineering workers would have cost the company significantly more.

“It’s often impossible to find the type of talent you need in the market and the cost of hiring and training can be upwards of 50, 60, 70 thousand dollars and sometimes much more for each new hire,” Coursera’s Belsky said.

Moreover, while high employee turnover creates a fiscal black hole, there are more reasons to avoid escorting employees to the door. Employees uphold the structure and culture of an institution. They know the lingo of the work they do and they know who is who.

Belsky explained that investing in reskilling can motivate the staff with credentials and certificates that can have a lasting impact on their careers and make them engage in their learning. This investment leaves the company executives with a skilled workforce and employees who feel fulfilled: the best of both worlds.

GDPR one year later: Slow compliance, lax enforcement https://sdtimes.com/data/gdpr-one-year-later-slow-compliance-lax-enforcement/ Thu, 23 May 2019 13:00:40 +0000

It’s been one year since the General Data Protection Regulation (GDPR) went into effect. The regulation completely changes how organizations need to handle the data of European Union citizens.

The impact of the GDPR, though, has been minimal to this point. Compliance has been slow, enforcement has been lax, and organizations are finding that learning about data origin, residence and use can be hugely daunting and difficult.

RELATED CONTENT: Microsoft wants the US to follow the EU and establish new data privacy laws 

When it first went into effect, there was a lot of panic among organizations that did business in the European Union (EU), because the fines for not complying can be steep. According to Christian Wigand, a spokesman for the European Commission, fines are determined based on a number of factors, such as how the company protected its data, how it reacted to a data breach, and whether it cooperated with the authorities.

According to a report from DLA Piper, as of February, 91 fines have been issued. They noted that a majority of those fines were relatively low in value. One of the major fines was levied against Google, which France fined €50 million, roughly US$56 million. According to a press release from the European Data Protection Board, Google is being fined “for lack of transparency, inadequate information and lack of valid consent regarding the ads personalization.”

The regulation lists two different tiers of fines. For less severe violations, a company can be fined up to €10 million or up to two percent of its revenue from the previous financial year. More severe violations can cost a company up to €20 million or up to four percent of its revenue from the previous financial year.

According to DLA Piper’s report, the lower fines are applied in response to the breach of obligations to the controllers and processors (including breach notification obligations), certification bodies, and monitoring bodies as described in the regulation. The higher fine is applied when there is a breach of the “basic principles for processing including conditions for consent, data subjects’ rights, international transfer restrictions, any obligations imposed by Member State law for special cases such as processing employee data, [and] certain orders of a supervisory authority.”
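
Because the applicable maximum is whichever figure is higher, the flat cap or the revenue percentage, the ceiling scales with company size. A small Python sketch of that arithmetic (the revenue figure is purely illustrative):

```python
def gdpr_max_fine(annual_revenue_eur: float, severe: bool) -> float:
    """Upper bound on a GDPR fine: the flat cap or the percentage of the
    previous financial year's worldwide revenue, whichever is higher."""
    if severe:
        return max(20_000_000, 0.04 * annual_revenue_eur)
    return max(10_000_000, 0.02 * annual_revenue_eur)

# Illustrative only: a company with EUR 5 billion in annual revenue faces a
# ceiling of EUR 200 million for a severe violation, far above the flat cap.
print(f"EUR {gdpr_max_fine(5_000_000_000, severe=True):,.0f}")
```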

According to Wigand, the fines that are collected will go back into the country’s budget. How the money is used will be determined by the individual country.

The enforcement of the GDPR is the responsibility of national data protection authorities, together forming the European Data Protection Board (EDPB), Wigand explained.

“The only company that I’ve seen that’s a big news story as far as GDPR enforcement was that fine that they imposed on Google,” said Matt Hayes, vice president of SAP Business at Attunity, a data management company. “And I believe Google is challenging it, but I haven’t seen too much enforcement of GDPR outside of that.”

According to Hayes, there is a lot riding on the outcome of Google’s case if they try to fight the fine. “If Google can fight it and win it, that is problematic. If Google doesn’t win it, then I think it’s something that a lot of companies will notice.”

Enforcement leads to compliance
Hayes expects enforcement to pick up soon, if the EU hopes to ensure compliance. If enforcement remains as lax as it is, companies will continue only loosely complying with the law, he explained.

For example, Attunity provides a product that deals with the right to be forgotten. Hayes believes that some of their customers have bought that product but haven’t implemented it. “We’ve seen some companies say just by owning [Attunity’s] software we can demonstrate some level of compliance. So they can actually own our software and not implement it.”

Hayes believes we’ll continue to see this slow compliance, unless there is a concrete reason for companies to really speed things up. “I think we’re going to see companies say, look we’ve taken a few baby steps towards GDPR, but the minute that they find out that they’re going to be audited, or the minute that there’s some enforcement in their industry, they might then decide to tighten the screws up a bit.”

According to Gary LaFever, CEO of Anonos, different countries have been taking different approaches to audits. The law is being enforced on a country-by-country basis, rather than a single entity doing it for the EU as a whole.

For example, in Italy they are doing audits in conjunction with the tax collectors. “They’re going in and they’re just doing random audits of people to see if they’re in compliance with the GDPR,” said LaFever.

But despite the slow enforcement of the law, many organizations have made changes to their data practices, which has led to structural, technical, and cultural changes within organizations.

“From an organizational perspective, I’ve seen a lot more teamwork at companies today,” said Scott Giordano, a privacy attorney, IAPP Fellow of Information Privacy, and vice president of data protection at Spirion, a data security software provider. “Think about legal and compliance, risk management, HR, internal audits — they’re all now at the table, whereas previously it really was confined to IT and IT security.”

Companies now have to be aware of where all of their data is stored, which is something that a lot of companies struggled with, and still continue to struggle with, explained Tim Woods, vice president of technology alliances at FireMon, a security company.

John Pocknell, senior solutions product marketing manager at Quest Software, explained that the person to do that identification is a database administrator (DBA). Having someone in place to oversee data isn’t a requirement of the GDPR, although having data protection in place is. Now, Pocknell is seeing U.S. companies start to put people in place whose main responsibility is to protect the company’s data. “So even though it’s not a requirement, we are beginning to see DBAs become a bigger part of that sort of data protection role.”

As a result of the GDPR, companies have put more of a focus on accountability when it comes to their data, said Jean-Michel Franco, senior director of product marketing at Talend, a data integration company. There should be someone at every company who is held accountable for how the company is complying with the regulation. “I think this was the most important change and the most impactful change,” said Franco. “Until the company had the DPO (Data Protection Officer), GDPR remained something a little abstract and something a little boring, and a regulation that they didn’t care much about. Once a DPO has been nominated, it changes the way that the enterprise proceeds.”

Companies need to decide whether they should hire a data protection officer, explained FireMon’s Woods. If a company suffers a breach and is found guilty of not doing its due diligence — things like not having a data protection officer or not running internal assessments — then the fines could be much higher. “You have to provide them a way to be forgotten, you have to offer a right of erasure or elimination or the right to be forgotten. Once you have my information, if you don’t provide that and a breach happens, then you’re going to be found at fault,” said FireMon’s Woods.
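
Mechanically, honoring an erasure request means finding every store that holds the subject’s records, deleting or anonymizing them, and keeping an auditable record of the action. The Python sketch below is a deliberately simplified illustration; the data stores, field names, and log format are all hypothetical, and a real system would also have to reach databases, backups, and third-party processors.

```python
from datetime import datetime, timezone

# Hypothetical in-memory stand-ins for systems that hold personal data.
crm = {"u123": {"name": "Jane Doe", "email": "jane@example.com"}}
analytics = {"u123": {"email": "jane@example.com", "page_views": 5412}}
erasure_log = []

def erase_subject(user_id: str) -> None:
    """Handle a right-to-erasure request: remove identifying data from every
    store, keep non-identifying aggregates, and record the action."""
    # Full deletion where the entire record is personal data.
    crm.pop(user_id, None)
    # Anonymize rather than delete: the aggregate count stays, the identifier goes.
    if user_id in analytics:
        analytics[user_id].pop("email", None)
    erasure_log.append({
        "user_id": user_id,
        "erased_at": datetime.now(timezone.utc).isoformat(),
    })

erase_subject("u123")
print(crm, analytics, erasure_log)
```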

“I think as corporations look at this, it’s going to have them questioning what they are doing from a response perspective,” said Woods. Companies will have to start asking questions like: “What am I doing? Are we running data impact assessments? If we don’t have a data protection officer, should we get one now? What are we going to do when a breach occurs — not if, but when — not knowing the significance or how deep that breach may be? What’s going to be our posture in response when we have to notify the DPA, the Data Protection Authority, within our 72-hour quote unquote period? How are we going to be perceived as a company from a readiness position?”

Quest’s Pocknell believes that a lot of companies were not prepared for the regulation. Companies may have started on the path, but in light of reports of recent big data breaches from companies outside of the EU, it’s a wake-up call that this is something everyone should be addressing.

Wigand believes that companies were given sufficient time to prepare for the regulation prior to its implementation in May 2018. The GDPR was adopted in December 2015, and guidelines on how to apply the new rules were published by the commission in January 2018.

The GDPR has been a double-edged sword for companies
Overall, the GDPR has led to positive effects for consumers, and negative (and some inadvertent positive) effects for companies.

“At HubSpot, we believe that GDPR is a good thing for the sales and marketing industry,” said Jon Dick, vice president of marketing. “It puts our industry down a path that we believe in deeply, a path of being customer first, of abandoning spammy tactics, and of being more inbound. Today to be successful, it’s not just about what you sell, it’s about how you sell. The companies that are doing this well provide an exceptional experience and are transparent about how they’re using your data.”

And while overall the GDPR has been good for consumers, most consumers still don’t know much about the GDPR. According to research from HubSpot, only 48.4 percent of consumers in the EU were familiar with the regulation and 63.9 percent of consumers in the UK were familiar. They also found that EU consumers are less familiar with the GDPR this year than they were in 2018 (32.9 percent in 2018 vs. 26.3 percent in 2019).

From the perspective of organizations, there are a few potential negatives to non-compliance. In addition to the financial penalty of not complying, there are also business risks if something were to happen to data that you weren’t protecting properly, Quest’s Pocknell explained. “You wouldn’t want to be that company that bestows personal data into the public domain. Yes, you’re going to get a fine, but imagine how much business you’re going to lose,” he said.

Another negative is that a lot of companies have lost a lot of their contacts, Talend’s Franco said. Companies that didn’t have consent to the data struggled to get consent when they followed up with those users as the law went into effect. On the flip side, some companies used this as an opportunity to connect with their customers and establish trust by being transparent about why they wanted the data and how it would be used, Franco explained.

Gary LaFever, CEO of Anonos, a data company, believes that this data loss could be catastrophic. He mentioned a study by IDC that found that a top-five hospitality firm had deleted two decades of customer data. “Now they have no means of tracking historically how new initiatives compare against what they’ve done in the past,” said LaFever. “So particularly for AI and analytics, which need a baseline that includes what’s happening today and what’s been done in the past, you lose access to that data.”

LaFever notes that this doesn’t really apply to highly regulated industries, which are likely required to maintain a separate copy of their data for audits or inquiries.

But beyond the regulation itself, the process of preparing for it has had a positive impact on organizations. By taking a deep look at their data practices, companies have been able to make better use of their data and develop better practices for handling it, even beyond the requirements of the GDPR.

According to Giordano, the GDPR has forced organizations to look at every element of the life cycle of personal data, including “how you’re collecting it, how you’re using it, how you’re sharing it, perhaps most importantly, and then disposing of it,” he said.

For example, Talend’s Franco has worked with a company that now has a better understanding of where all of its data is, so it is able to leverage that data for analysis. “They took the opportunity to get control over their data.”

Franco notes, however, that he believes this to be true only for large corporations. There are a lot of companies out there that did only the bare minimum to be legally in compliance, he explained.

FireMon’s Woods believes that companies have begun to reassess how ready they are to respond to a breach. He attributes this more to the rise of breaches in general than to the GDPR directly, though the GDPR does put more pressure on companies to respond to breaches faster. According to DLA Piper’s report, as of February 2019, 59,000 personal data breaches had been brought to the attention of regulators since the law went into effect. Those breaches range from minor ones, such as an email accidentally being sent to the wrong recipient, to major attacks that affect millions of people.

The GDPR gives companies 72 hours from discovery of a data breach to notify regulators. According to Woods, those 72 hours feel like a second when you have to identify how you’ve been breached and the scope of the breach, such as how many people were affected and how much data was compromised, and on top of that notify the Data Protection Authority that you’ve been breached. “So 72 hours is not a lot of time to prepare, I think, to understand what the extent of a given breach is.”

For example, last year it was revealed that Marriott had suffered a breach going back to 2014. “I mean how do you assess a breach that goes back that far? How much information has actually been compromised?” said Woods. “So I think probably for the EU, GDPR has had an impact. I think in general for the U.S. and other countries, the rise of breaches in general is causing people to better their posture from a breach incident reporting and forensics gathering position.”

GDPR is paving the way for more data regulations
“A lot of the companies outside of the European Union, regardless of whether or not they use data that originated from the European Union, are taking a look at where they stand,” said Pocknell. The GDPR is just the beginning; other data regulations are beginning to pop up as well. The most noteworthy is the California Consumer Privacy Act, which was signed into law last year and goes into effect in January 2020.

Industry experts expect that other states will soon follow suit with their own regulations. California is paving the way, but more will come soon, FireMon’s Woods believes. “I think all the states are following California’s lead right now on what they’re going to do from a personal privacy perspective to protect users. So no doubt. Everybody’s going to be following that.”

“My guess would be that a year from now we’ll probably see two or three or four other high-profile states passing data protection laws,” said Attunity’s Hayes.

The GDPR doesn’t cover everything
According to Hayes of Attunity, a data management company, the laws are more descriptive than prescriptive. This can cause some headaches for companies that are trying to figure out what they need to do to be compliant.

For example, there’s nothing definitive in the GDPR about personal data on backup tapes, Giordano explained. But if personal data is restored from a backup tape after it had been deleted, organizations will have to go in and make sure that that data gets re-deleted. This probably means organizations need to rethink their backup practices.
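
One way teams handle this, though it is not something the GDPR itself prescribes, is to keep a ledger of completed erasure requests and replay it after any restore from backup, so previously erased records are deleted again. Below is a minimal sketch of that idea in Go; all names and types are hypothetical.

```go
package main

import (
	"fmt"
	"time"
)

// Tombstone records a completed erasure request so that it can be
// replayed after a backup restore. All names here are hypothetical.
type Tombstone struct {
	SubjectID string    // identifier of the data subject
	ErasedAt  time.Time // when the original erasure was carried out
}

// memStore stands in for whatever datastore is rebuilt from backup.
type memStore map[string]string

// Delete is idempotent, so replaying the ledger is always safe.
func (m memStore) Delete(subjectID string) {
	delete(m, subjectID)
}

// replayTombstones re-applies every recorded erasure after a restore,
// so data removed under a right-to-erasure request stays removed.
func replayTombstones(s memStore, ledger []Tombstone) {
	for _, t := range ledger {
		s.Delete(t.SubjectID)
	}
}

func main() {
	// Simulate a restore that brought back an already-erased record.
	restored := memStore{"user-42": "restored personal data"}
	ledger := []Tombstone{{SubjectID: "user-42", ErasedAt: time.Now()}}

	replayTombstones(restored, ledger)
	fmt.Println(len(restored)) // 0: the record was re-deleted
}
```

The important property is that the ledger itself survives the restore; if it lives on the same backup tapes as everything else, the erasure history can be lost along with the data.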

Another scenario that’s not really addressed by the GDPR is tourism, Giordano explained. “Say that you’ve got an EU person and they’re here on vacation. Are they covered by the regulation? The GDPR doesn’t touch that and doesn’t really comment on it. Even some post-GDPR commentary didn’t do a very good job of talking about it.”

Legitimate interest can be used in place of consent
According to LaFever of Anonos, there are six legal bases under the GDPR for processing data. Consent was the most commonly used one in the past, but the GDPR radically changes what counts as valid consent. “If data was collected using the old-fashioned approach to consent, it would not be legal.”

The GDPR does not have a grandfather clause that would enable companies to keep data that they collected using that old method of consent before the law went into effect.

But, according to LaFever, there is another legal basis called legitimate interest processing. If you can prove that you have the proper technical and organizational controls in place that mitigate the risk of data loss, you can use that data under the legitimate interest basis.

This is stated in Recital 47 of the GDPR, which says: “Such legitimate interest could exist for example where there is a relevant and appropriate relationship between the data subject and the controller in situations such as where the data subject is a client or in the service of the controller.” Recital 47 also states that processing data necessary for the purposes of preventing fraud and processing personal data for direct marketing purposes may be considered legitimate interest.

 

The post GDPR one year later: Slow compliance, lax enforcement appeared first on SD Times.

With microservices, more isn’t always better https://sdtimes.com/micro/with-microservices-more-isnt-always-better/ Mon, 08 Apr 2019 13:00:56 +0000 https://sdtimes.com/?p=34942

The benefits of microservices are undeniable. Software development companies want to be able to deliver software rapidly, frequently and reliably — and microservices are a means to that end. A recent O’Reilly Media report found that more than 50 percent of software projects are currently using microservices. Of those surveyed, 86 percent have found at least partial success with microservices while only 15 percent have seen massive success.

The problem is that, like with any new trend, businesses tend to jump on the bandwagon too quickly, according to Michael Hamrah, chief architect at the HR software company Namely. “You think just by having it that you unlock the value proposition, but like anything there is a learning experience,” he said.

A common mistake Hamrah sees when businesses move to microservices without a clear understanding or intent is ending up with what some know as Frankenstein microservices, microliths or monoservices, where you have an inability to manage services, set service boundaries or provide good reliability.

“Organizations need to assess if microservices are a right fit for them and adapt their organizational structure and culture before embarking on a microservices adventure,” said Carlos Sanchez, principal software engineer at continuous delivery software provider CloudBees. “Microservice architectures can start with good intentions, but can go wrong and end up with Frankenstein microservices or distributed monoliths that manage to get the worst of microservices and monoliths: slow development velocity, more defects and increased complexity.”

However, since microservices are a completely different way of working and there is no framework out there to tell you what to do, it can be hard to tell whether or not you are on the right path before it is too late, Hamrah explained. You need to ask yourself: Are you releasing to production and continuing to release to production? “If you are doing that and you are feeling really good about your ability to develop features and work on features, then that is the most important thing no matter how you are doing it,” he said. If you are struggling to manage infrastructure, you probably need to rethink your architecture.

Hamrah provides four considerations to keep in mind when creating a microservice (a minimal sketch of a service built along these lines follows the list):

  1. Think about what the boundaries and the APIs are. Hamrah explained a service “must be the definitive source of truth for data and functionality it is intended to cover.”
  2. Microservices must promote loose coupling so operations are simplified and services can evolve independently of one another.
  3. Microservices should create opportunities and add value. “You should really be thinking about what new service can you leverage in various ways. What new data can you provide? What new functionality can you enhance through your product?” he asked. “And then going back to your first two principles, are you able to do that independently and just focus on that piece?”
  4. Services must be reliable, so it is important to think about uptime and usage, according to Hamrah. This includes properly monitoring them and having observability into the service.
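
To make those considerations concrete, here is a minimal sketch in Go of a service that owns one piece of data outright (consideration 1), exposes only a narrow HTTP contract for others to couple to (consideration 2), and offers a health endpoint as a starting point for the monitoring and observability of consideration 4. The service, routes and data are hypothetical, not drawn from any company quoted here.

```go
package main

import (
	"encoding/json"
	"net/http"
)

// Invoice is the one piece of data this hypothetical service owns.
// Nothing else reads its storage directly; the service is the
// definitive source of truth for invoices.
type Invoice struct {
	ID     string  `json:"id"`
	Amount float64 `json:"amount"`
}

// An in-memory stand-in for the service's private datastore.
var invoices = map[string]Invoice{
	"inv-1": {ID: "inv-1", Amount: 99.50},
}

func main() {
	mux := http.NewServeMux()

	// The narrow, explicit API boundary: consumers couple to this
	// contract, never to the service's internals.
	mux.HandleFunc("/invoices/", func(w http.ResponseWriter, r *http.Request) {
		id := r.URL.Path[len("/invoices/"):]
		inv, ok := invoices[id]
		if !ok {
			http.NotFound(w, r)
			return
		}
		json.NewEncoder(w).Encode(inv)
	})

	// A health endpoint gives load balancers and monitoring systems
	// a hook, a first step toward the reliability consideration.
	mux.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})

	http.ListenAndServe(":8080", mux)
}
```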

Another reason companies are having trouble successfully implementing microservices is that they get too caught up in the term microservice itself, according to Chris Richardson, microservices consultant and founder of the transactional microservices startup Eventuate. While a common belief around microservices is that they should be small, single-function services, Richardson believes that is a nebulous idea and it is better to create cohesive services with functions that belong together. “The big idea of microservices is functional decomposition to accelerate development,” he explained. “If a team, one of many teams developing an application, is in production with a single service, then why split it, since more services equals more complexity.”

Richardson explained that “micro” tends to imply things have to be small, when really it is an architectural style that calls for loosely coupled services within an application.

Organizations often have the misconception that the more services, the better. “This pushes people down the route of creating or defining too many services,” he said. “I always try to tell people to aim for as few services as possible, and only if there is a problem should you start to split and introduce more services.” Your services should implement meaningful chunks of business functionality, he added. Richardson’s characteristics for a microservice are that it is easy to maintain, evolve, test and deploy.

If you find yourself in the trough of disillusionment with microservices, Hamrah said it is important not to get discouraged. “There is initial excitement when people go in, and maybe they go in too fast, but you learn from that and hopefully you are constantly learning, iterating and improving,” he said.

When the computer software company Nuance recently went through a microservices transformation, the team found it was hard to get the balance right. “What service should be responsible for what and avoiding creating a new monolith in the process by trying to put too much into one place was really hard,” said Tom Coates, senior principal architect for Nuance. “We had a couple false starts that were way too big and too much like the old system, but we kept refining and breaking it up until we got to a place where we were comfortable.”

Hamrah added, “If you can move forward to solve your immediate problem and you have a very healthy culture of refactoring, improving and iterating — I think you are going to work through these common early mistakes and get to a point where you fully understand and adopt patterns of a healthy microservice ecosystem.”

Managing the complexity
The benefits of microservices can overshadow an important fact of moving to this architecture: it is often more complicated, because data is distributed, there are many moving parts, and there is a lot more to manage and monitor, according to Eventuate’s Richardson. It is a big change going from managing one codebase to managing multiple, even hundreds of, codebases and services; and sometimes it’s not the right choice.

“Companies jump too early into microservices. I don’t think there is anything wrong with monoliths. There is nothing wrong with starting out early products, or even projects in large organizations, in a monolith way,” said Hamrah. “You really want to be focused on where you are spending your effort, and if you are spending too much effort dealing with bugs, not being able to release your code because things are too tightly coupled, or not being able to make safe refactoring choices, you probably need to move to microservices.”

If your application is not large or complex and you don’t need to deliver it rapidly, then you are probably better off with a monolith, explained Richardson.

“You shouldn’t do microservices until you need them. You shouldn’t use them for small applications, or in small organizations that don’t have any of the problems that microservices are trying to solve,” said CloudBees’ Sanchez.

If you do decide a microservice architecture is the best fit for your organization and teams, Nuance’s Coates said from experience it is best not to try and go in halfway. “It is an all-or-nothing proposition in my opinion,” he said. “Unless you have a system that already has some clearly defined interfaces then maybe you can try to piecemeal it. Our first attempts were, let’s do a little here and there, and that just doesn’t work. Since it is such a fundamental shift, it’s tough to make it play nicely with other legacy systems. If you are going to migrate from a classic architecture to a microservice architecture, you have to more or less greenfield the entire thing and start from ground zero.”

To do this, you need tooling and processes in place to ease the complexity, such as service meshes or server monitoring and distributed tracing tools, according to Sanchez. “You need an automated pipeline from development to production, as you can’t manually scale the monolith architecture processes to dozens or hundreds of microservices. You also need to take advantage of DevOps and progressive delivery methodologies, like blue-green or canary,” Sanchez added.
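
In practice, the canary and blue-green splitting Sanchez mentions is usually configured declaratively in a service mesh or deployment platform rather than written by hand, but a few lines of Go show the mechanism: a small, adjustable fraction of requests is steered at the new version while the rest stay on the stable one. The service names and percentage here are hypothetical.

```go
package main

import (
	"fmt"
	"math/rand"
)

// chooseBackend implements the core of a canary rollout: route a
// small, configurable percentage of requests to the new version and
// the rest to the stable release.
func chooseBackend(canaryPercent int) string {
	if rand.Intn(100) < canaryPercent {
		return "service-v2" // the canary
	}
	return "service-v1" // the stable release
}

func main() {
	// Send roughly 5% of traffic to the canary. If its error rate and
	// latency stay healthy, the percentage is ratcheted up until v2
	// takes all traffic and v1 can be retired (the blue-green endgame).
	counts := map[string]int{}
	for i := 0; i < 10000; i++ {
		counts[chooseBackend(5)]++
	}
	fmt.Println(counts)
}
```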

Additionally, APM-based tools will help teams get the right observability capabilities in place, according to Namely’s Hamrah. Some tools Hamrah recommended include the open-source framework gRPC for defining services and Istio for monitoring traffic. But when it comes to picking tools and addressing challenges, Hamrah explained teams should be aware of whether they are actually struggling with managing their infrastructure or struggling with a particular technology they are using to accomplish their goals.

Calvin French-Owen, CTO and co-founder of customer data infrastructure company Segment, focuses on three different pieces when it comes to building microservices: making sure they are easy to build in the first place, by providing a service template and the scaffolding necessary to get users going; making sure they are easy to deploy and to spin up infrastructure for; and making sure it is easy to monitor and understand what is going on in production.

In order to tell whether you are actually on the right path and improving, some key metrics to look at are the time it takes from when a developer commits or checks in a change until that change is deployed into production, and how frequently you are releasing changes to production, Eventuate’s Richardson explained. “Improvement efforts should be centered around improving those metrics. If you are adopting microservices but not seeing those metrics improve, then something isn’t right,” he said.
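
Those two numbers, often called lead time for changes and deployment frequency, can be computed from data most CI/CD pipelines already record. Here is a toy sketch in Go, with hypothetical history standing in for a real pipeline’s records:

```go
package main

import (
	"fmt"
	"time"
)

// Deployment pairs a change's commit time with its production deploy
// time. Real records would come from the VCS and the CD pipeline.
type Deployment struct {
	CommittedAt time.Time
	DeployedAt  time.Time
}

// leadTime returns the average commit-to-production time, the first
// metric Richardson describes.
func leadTime(deps []Deployment) time.Duration {
	var total time.Duration
	for _, d := range deps {
		total += d.DeployedAt.Sub(d.CommittedAt)
	}
	return total / time.Duration(len(deps))
}

// deployFrequency returns deploys per day over the observed window,
// the second metric.
func deployFrequency(deps []Deployment) float64 {
	window := deps[len(deps)-1].DeployedAt.Sub(deps[0].DeployedAt)
	return float64(len(deps)) / (window.Hours() / 24)
}

func main() {
	now := time.Now()
	deps := []Deployment{ // hypothetical history, oldest first
		{CommittedAt: now.Add(-50 * time.Hour), DeployedAt: now.Add(-48 * time.Hour)},
		{CommittedAt: now.Add(-27 * time.Hour), DeployedAt: now.Add(-24 * time.Hour)},
		{CommittedAt: now.Add(-4 * time.Hour), DeployedAt: now},
	}
	fmt.Println("average lead time:", leadTime(deps)) // 3h0m0s
	fmt.Printf("deploys per day: %.1f\n", deployFrequency(deps)) // 1.5
}
```

If microservices are working, both numbers should trend the right way: lead time down, deployment frequency up.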

Lastly, Nuance’s Coates added that having architectural guidelines in place can help teams understand what a service should and shouldn’t do. “Each service has a purpose and should be able to stand on its own, describe what it is for, and how someone might use it. No matter what microservice you are looking at, you know your way around it because it is packaged and laid out similar to other ones.”

The post With microservices, more isn’t always better appeared first on SD Times.
