Friday, 18 July 2014

Why do we need Software Testing?

This is an excellent question, and one that regularly gets asked in organisations that have to deliver projects and software. Why can’t the developers just test it? Why can’t the end users test it? Surely anyone can test?

The growth of Software Testing as an industry over the last 20 years is a clear indication of the importance that large and small businesses place on having workable, easy-to-use software. It is no coincidence that this growth has accelerated as we now use software in everything we do – surfing the internet, in our cars, on our tablets and mobile devices, even typing this blog! So we, as users, should know what good looks like, and what bad looks like…
We have all had moments when a program crashes mid-use, data goes missing, or a website illogically asks you to re-enter all your details when you’re trying to book a holiday! So, does using these programs make you a software tester?
Being a software tester is like being a food critic really – I, personally, have no idea how to make a chocolate soufflé or a fricassee of mung beans and samphire, but I do know whether or not I like the taste. However, food critics have an advanced knowledge of food combinations, an objective and consistent opinion, and tend to advocate high-quality food. Software testers are similar – they may not necessarily know how to develop the next Windows or Mac operating system, but they will definitely know whether it’s good or not, and their opinion in the marketplace affects the view of whether it is a successful and popular product. It can make or break a version, a product or even a company.
However, even software testing skills are changing. Testers are becoming even more highly skilled and are bridging the gap between development and testing by learning coding techniques. This allows for more automated testing and makes the testing even more efficient and effective. With software becoming ever more sophisticated, the number of test scenarios that can arise from a seemingly simple piece of functionality can be mind-boggling and run into the millions – it would take a human tester years to cover every scenario, and even a risk-based approach would consume resources without covering every possible outcome. As a consequence, the work of software testers is becoming much more about using clever programs and a variety of tools to cover as much ground as possible.
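As a hedged illustration of how tooling covers large scenario counts, the sketch below uses pytest’s parametrize feature so that one test exercises many input combinations of a hypothetical date-of-birth check; the function, its rules and the test data are assumptions made purely for this example, not a description of any TDX Group system.

```python
# A minimal sketch: one parametrised pytest covering many input combinations.
# validate_dob and its rules are hypothetical, purely for illustration.
from datetime import date

import pytest


def validate_dob(dob: date, today: date = date(2014, 7, 18)) -> bool:
    """Hypothetical rule: date of birth must be in the past and within 120 years."""
    if dob >= today:
        return False
    return (today.year - dob.year) <= 120


@pytest.mark.parametrize("dob, expected", [
    (date(1980, 5, 17), True),    # ordinary adult
    (date(2014, 7, 18), False),   # born "today" - not valid
    (date(2020, 1, 1), False),    # future date
    (date(1890, 1, 1), False),    # implausibly old
    (date(2000, 2, 29), True),    # leap-day edge case
])
def test_validate_dob(dob, expected):
    assert validate_dob(dob) is expected
```

Adding another scenario is a one-line change to the parameter table, which is what makes this kind of automation scale where manual checking cannot.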
We know that we can never test every possible variable – it’s impossible; why else would Microsoft and Apple need to release updates? Things change, and change needs testing. We can, however, reduce risk – recent high-profile cases in the press, such as Amazon, highlight the fact that even the slightest mistake can cost a company millions. Data is now one of the main currencies in the world, and the Data Protection Act and privacy laws mean that breaches caused by software errors are treated with the highest level of severity and mistakes are not tolerated. Cloud computing, multiple access points and internet forums are all threats to a company’s reputation and balance sheet.
So back to the question – why do we need software testing? The answer is to reduce the risk of external failure. An internal failure, such as a defect found before release, is fine because we can fix it and deal with it; if software fails externally, however, the world knows and it’s too late. Testers are a different breed – some say pedantic (and they are right) – but without them, who will check, to the nth degree, that a button on a website does what it should and doesn’t do what it shouldn’t?
Here at TDX Group we strive to ensure that all our software is tested following industry best practice, the tools we use are cutting edge, and the testers we hire are multi-skilled. We reduce risk and think of our customers – they don’t want 300 buttons when one will do! And we will continue to do so because we build our reputation on quality. We strive to reach the impossible goal and dream of the day we can say – you know what? We have managed to test everything. So next time you use a website and click the submit button, think of how much data has been validated, stored, organised, processed and actioned to get that button to work. And of the thousands of tests that will have checked that the date of birth you entered is valid and correct, that your password and username combination satisfies the criteria, and that everything just works – that’s because we checked it all.

By Paul Sibley, Software Testing Manager, TDX Group

Thursday, 10 July 2014

Cake, cake, cake

Working at TDX Group can be a challenge, and one of the biggest I’ve faced since joining the team is all the goodies that are so regularly on offer to celebrate our success!

June saw the final round of the TDX Group cake bake off – the show-stopper round, and the celebratory afternoon tea. Now, I’m all for celebrating but it comes at a price; my diet app doesn’t like it!

Over the past 10 years I’ve been a slave to my weight. Like many people I’ve been on a range of diets, some successful and some not. I’m under no illusion and realise that the main blocker to my success is usually me; after all, most diets are simply a controlled way of restricting calorie intake while promoting exercise. The similarity I’d like to draw between dietary habits and information security is that applying them both successfully is a tricky balance between control and manageability.

During periods of over-indulgence, I’m without restriction and, quite frankly, anything can happen… Imagine a world where nothing is controlled: colleagues are left to get on with their day without security controls or restrictions. No content filtering to slow down progress, no anti-spam software occasionally blocking legitimate emails, no policies, procedural controls, anti-virus and so on. Viruses would quickly and easily get into the network, information would soon be lost or compromised and our business would fall over; the weight gets piled on.

At the other end of the scale you could imagine something from Mission Impossible: security through ultimate control. To access a system you enter a fort by passing through a guarded barrier with a photo ID proximity pass, you move on to another secure door with retina or fingerprint scanning, and then through a final secure door with a key-coded lock. Once inside you access a standalone system with no internet or network connectivity and use multi-factor authentication to log on to a PC which doesn’t permit removable media. Nice and secure, and there is no way for a virus to get in or data to get out, but the day job is impossible and the user will soon start to look for cheats and workarounds. Those 500-calorie-a-day diets have such strict controls in place that it seems impossible to stick to them while retaining your sanity; losing weight is guaranteed, but it’s unfeasible as a long-term solution.

So, we apply a risk-managed approach which compares what colleagues want to do against the long-term risk of them doing it; too much control and they can’t work effectively, so they look for insecure alternatives; too little and things start to fall over…
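To make that balance concrete, here is a minimal, entirely hypothetical sketch of a risk-managed decision: each request is scored by likelihood and impact and weighed against its business value, so only disproportionate risks are blocked outright. The categories, scores and thresholds are illustrative assumptions, not a description of TDX Group’s controls.

```python
# Hypothetical illustration of a risk-managed control decision:
# weigh business value against likelihood x impact rather than blocking everything.
from dataclasses import dataclass


@dataclass
class Request:
    description: str
    business_value: int   # 1 (low) to 5 (high)
    likelihood: int       # 1 (rare) to 5 (almost certain)
    impact: int           # 1 (negligible) to 5 (severe)


def decide(req: Request) -> str:
    risk = req.likelihood * req.impact          # simple risk-matrix score, 1-25
    if risk >= 20:
        return "block"                          # too much exposure, whatever the value
    if risk > req.business_value * 3:
        return "allow with extra controls"      # e.g. monitoring, approval, encryption
    return "allow"


print(decide(Request("share report with supplier via email", 4, 2, 3)))      # allow
print(decide(Request("use personal USB stick for customer data", 2, 4, 5)))  # block
```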

My best dieting successes have come from a blend of control and balance; everything in moderation.  Losing control and having that big slice of cake won’t help with weight loss, and watching everyone eat while you stay in ultimate control may well send you crazy, but just a small slice will keep you happy and is unlikely to scupper the long term plan.

By Vicky Clayton – Information Security Officer, TDX Group
 

Wednesday, 2 July 2014

Looking good? The importance of design in Management Information

I have already talked about the principles behind making great Management Information (MI) but there is one final area that is often overlooked, despite being the most obvious: design. Truly great MI has to be well designed in order to have a real impact and to be really appreciated within a business.
Nowadays there is an ever-evolving love affair with data visualisation. Some see it as an opportunity to bring data and analysis to a wider audience through more relatable visuals whilst others see it as an art form in itself. However, data visualisation for me should do one of two things: either tell a story or bring a complex data set to life.

You will probably have seen infographics that tell a story, usually breaking down a topic to its key facts and broader implications to make for an engaging read, such as this gem on ‘Documents’. Infographics do have their use within a business; however, they are most powerful as marketing tools and a way of engaging with clients both new and existing. Turning complex data into a visual that makes instant sense is a difficult thing to do, as anyone who has ever tried to represent a large data set in Excel will know. Take, for example, this chart which shows the connections and activity of Facebook users across the globe. By transposing the data onto a familiar image (the earth) and representing activity through the neon lines we can easily relate to the data and instantly pick out interesting talking points such as China, South America and Africa. Not only is it functional, it is also beautiful, and I am a great believer in spending time on designing charts to both look good and be useful; it makes explaining them much easier.

In my time as a Consultant and as an Analyst at TDX Group I have put together many reports and MI dashboards, and have always been willing to put the extra time and effort into making their appearance as good as their content. In a recent project I presented some example MI in the client’s branding, which enabled them to relate to the examples in a more meaningful way. The discussion could then focus on the concepts of building an MI suite rather than on explaining unfamiliar examples.

I have also found that spending the time to make a chart look right has a great impact on how it is received. The biggest challenge is usually finding the best way to represent the relationships between data points and how they affect one another – the message is often lost when each point has its own visual, but when the points are combined into one chart it can change the conversation.
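To illustrate the point about combining related measures rather than giving each its own visual, here is a minimal sketch that plots a hypothetical placement volume and resolution rate together on a shared axis; the figures and labels are invented for the example and are not TDX Group data.

```python
# A minimal sketch: combine two related measures into a single chart
# instead of giving each its own visual. Data here is invented for illustration.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
accounts_placed = [1200, 1350, 1280, 1500, 1420, 1600]   # hypothetical volumes
resolution_rate = [0.42, 0.44, 0.41, 0.47, 0.49, 0.50]   # hypothetical KPI

fig, ax_volume = plt.subplots(figsize=(8, 4))
ax_volume.bar(months, accounts_placed, color="#c7d7e8", label="Accounts placed")
ax_volume.set_ylabel("Accounts placed")

ax_rate = ax_volume.twinx()                    # share the x-axis, second y-axis
ax_rate.plot(months, resolution_rate, color="#d9534f", marker="o",
             label="Resolution rate")
ax_rate.set_ylabel("Resolution rate")
ax_rate.set_ylim(0, 1)

ax_volume.set_title("Placements and resolution rate, shown together")
fig.tight_layout()
plt.show()
```

Seen side by side, a dip in the rate during a month of rising volume prompts a different conversation than either series would on its own.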

To me the design elements are just as crucial as getting the KPIs and the data correct. The design is often what will enable your MI to be read and understood on a wider scale. A well-designed MI suite reflects a knowledge and understanding of the business that gives confidence to those who rely on it on a daily basis.
By Stephen Hallam, Consultant, TDX Group

Tuesday, 24 June 2014

Head in the clouds

I recently read an article on the BBC news site about wastage in local government. The statistic that really stood out to me was that, of the £440 million spent by councils on IT in 2012-2013, only £385,000 was spent via G-Cloud – the government’s digital marketplace for procurement of IT systems and services. That’s less than 0.1% of spending, a staggeringly small proportion in a period of widespread cuts and ongoing efficiency drives.

The fact that councils aren’t embracing G-Cloud isn’t the biggest issue here; it’s the slow adoption of the wider concept of cloud-based IT as the preferred approach. As of 2013 around 30% of councils used no cloud-delivered services. The 70% embracing the cloud sounds promising, but when we dig deeper this tends to be in one or two niche areas within the council, or just email, with most local authorities continuing to spend the majority of funds on traditional on-premise IT and maintaining legacy systems.
There are two main reasons I’m interested in this, the first being the most obvious one of cost. Cloud services tend to be cheaper. There is no hardware on site, meaning lower initial setup and ongoing maintenance costs. This makes a big difference, as today 38% of IT budgets tend to be spent on support and maintenance. You also avoid waste. With traditional on-site hardware a large proportion of the functionality and computing power may never be used, but with the cloud you can generally pick and mix from modular options, and the hardware itself can be shared with other users.
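To put that 38% figure into rough context, here is a back-of-the-envelope calculation combining the article’s numbers with an assumed saving; the 30% cloud saving is a purely illustrative assumption, not a quoted figure.

```python
# Rough, illustrative arithmetic only: combines the figures quoted above
# (council IT spend, 38% of budgets on support and maintenance) with an assumed saving.
council_it_spend = 440_000_000          # GBP, councils' IT spend 2012-2013 (from the article)
support_share = 0.38                    # share of IT budgets spent on support/maintenance

support_spend = council_it_spend * support_share
print(f"Implied support and maintenance spend: £{support_spend:,.0f}")   # ~£167m

assumed_cloud_saving = 0.30             # hypothetical saving on that slice, for illustration
print(f"Hypothetical saving at 30%: £{support_spend * assumed_cloud_saving:,.0f}")
```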
The second and more interesting reason, though, is innovation; to me the cloud means progress. Cloud services can be updated quickly, with improvements rolled out to users remotely. Systems aren’t installed on site and forgotten about; they can evolve and improve, with all customers benefitting from new features and functionality. A cloud-based solution encourages the provider to work with their customers to optimise for the entire user base, rather than developing bespoke solutions for every client. This drives innovation and can result in significant benefits for customers, making it far easier to embrace new approaches and best practice. Interestingly, this comes back to my original point: sharing services between local authorities, or even between the public and private sectors, doesn’t just save on IT costs; it results in better, more flexible systems and improved services that are both more efficient and more effective – in short, you’re spending less and getting more.
One final thought while we’re talking about sharing: what about taking it a step further? Cloud services create the opportunity to share data and insight, not just servers and IT support. It might be a bit of a leap, particularly in the public sector, but knowing more is generally a good thing, and sharing data is a good way to get there. There may be hurdles to jump, but joining up these systems and maximising the use of data within and between local authorities – whether in revenues and benefits, public transport or housing – could have a far greater impact on cost savings than the current practice of reducing household or community services.
By Patrick O'Neil, Head of Pre-Sales Consulting, TDX Group 

Tuesday, 17 June 2014

The importance of solid foundations in a vendor management strategy

With ever-growing regulatory focus on the debt industry, 2014 is becoming the year in which we are all focusing on creating solid foundations for growth, supported by innovative new ways of increasing performance through data, analytics and segmentation. By analyzing what does and does not work in an Outside Collections Agency (OCA) management strategy, we can drive wide-scale benefits, not least to performance.

The foundations of such a strategy fall broadly into three categories:
  • Data transfer – Is information being effectively transferred to and from OCAs?
  • Process management – Are accounts fully reconciled and not getting stuck in any processes? (A simple reconciliation sketch follows this list.)
  • Portfolio visibility – Do you know exactly what suppliers are doing with each account?
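As a purely illustrative sketch of the reconciliation point above, the snippet below checks that every account placed with an agency appears exactly once in the agency’s return file; the account numbers, field names and data structures are assumptions made for the example, not a real file format.

```python
# Minimal sketch of a placement-vs-return reconciliation check.
# Account numbers and field names are invented for illustration.
placed_accounts = {"A1001", "A1002", "A1003", "A1004"}          # sent to the OCA
returned_rows = [
    {"account": "A1001", "status": "paid"},
    {"account": "A1002", "status": "in dispute"},
    {"account": "A1002", "status": "in dispute"},               # duplicate row
    {"account": "A1003", "status": "contact attempted"},
]

returned_accounts = [row["account"] for row in returned_rows]

missing = placed_accounts - set(returned_accounts)      # placed but never reported back
unexpected = set(returned_accounts) - placed_accounts   # reported back but never placed
duplicates = {a for a in returned_accounts if returned_accounts.count(a) > 1}

print("Missing from return file:", missing)      # {'A1004'}
print("Unexpected in return file:", unexpected)  # set()
print("Duplicated rows:", duplicates)            # {'A1002'}
```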
We know that ineffective or broken collections foundations result in a poor customer experience: for example, the need to re-supply information to agencies, delays in responding to queries, or continued contact attempts to wrong numbers. These are exactly the customer challenges that are driving the current focus on the industry from regulators such as the Consumer Financial Protection Bureau (CFPB).

The good news is that resolving these issues will not only ensure adherence to regulatory guidelines but also drive significant collections uplifts as the customer experience is inextricably linked to performance.

In the 21st century it is important that all industry participants have an effective data transfer mechanism to and, just as importantly, from agencies, as this ensures data accuracy. Accurate data not only prevents incorrect contact attempts, but also supports agencies in the collections process. In addition, a fast turnaround of disputes not only improves the customer experience but also drives uplifts in resultant performance on these accounts by over 40%*.

Finally, having account level visibility of supplier activity not only meets regulatory requirements around supplier monitoring but also helps to fundamentally change the performance management discussions of vendor managers.

There are many more examples which demonstrate the importance of focusing on, and improving, the basic foundations of an OCA management strategy. This is becoming ever more critical given growing regulatory requirements around third-party supplier management. But the benefits of getting this right are far wider reaching: reducing wasted resource and driving significant performance uplifts – something which I’m sure all industry participants would welcome.
 
*Source: TDX Group data, 2014




By Chris Smith, TDX Group

Wednesday, 11 June 2014

TDX moves down under

With TDX celebrating its 10th birthday this year, there's a lot of opportunity to look back at the history and evolution of the company; a small business created in a barn to meet a need in an evolving market has now become an expert in the industry.

Over my two years with TDX I've seen the rise of a number of exciting projects that are changing the market's landscape – not just in the UK but also overseas. This is why I am so excited about being a part of TDX's Australian (ad)venture! Last year we launched with Telstra, our flagship Australian client, and we are currently working together to deliver some fantastic results (we’re already in talks about how we could revitalise their portfolio again), and we’ve recently taken on our second client.

It's easy to rest on our laurels and talk about the performance and compliance benefits that are realised within the first few months of taking on a new client or a new portfolio; but for those who are looking to the future, creating a rich data asset to be mined over the coming years is what makes the real difference. This is how we provide our clients with insight into their customer base post-acquisition, which can be utilised not just to boost recoveries and collections, but also to ensure that all customers receive the right treatment depending on their circumstances.

It should come as no surprise that when an individual defaults on a telco or utility account, or on a line of credit with a financial institution, other defaults soon follow, as the root cause is often financial difficulty at the customer level. Having a single view of that customer beyond any one client’s portfolio allows you to control the flow of activity to that customer and ensure that they are treated fairly – protecting your brand whilst also leveraging the customer’s recent contact and income and expenditure information to make the appropriate decision.
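As a simple, hypothetical sketch of what a single customer view enables, the snippet below groups accounts from different clients under a shared customer identifier so that one coordinated treatment decision can be made per customer; the matching key and fields are assumptions made for illustration (real customer matching is considerably harder than a clean key).

```python
# Hypothetical sketch: build a single view of a customer across client portfolios.
# In practice matching relies on several fuzzy identifiers; a clean key is assumed here.
from collections import defaultdict

accounts = [
    {"client": "Telco A",   "customer_id": "C-42", "balance": 350.0, "in_hardship": True},
    {"client": "Utility B", "customer_id": "C-42", "balance": 120.0, "in_hardship": False},
    {"client": "Bank C",    "customer_id": "C-77", "balance": 900.0, "in_hardship": False},
]

single_view = defaultdict(list)
for account in accounts:
    single_view[account["customer_id"]].append(account)

for customer, accs in single_view.items():
    total = sum(a["balance"] for a in accs)
    hardship = any(a["in_hardship"] for a in accs)
    # One coordinated treatment decision per customer, not one per account.
    treatment = "pause contact, refer to support" if hardship else "standard strategy"
    print(customer, f"total £{total:.2f}", "->", treatment)
```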

This isn’t just relevant to the UK or the Australian market; across the globe, regulations are being tightened and net performance is being squeezed by an increased cost to collect. In this context, technology platforms are a vital asset for making collections and recoveries both cost-effective and fair over the long run.

Applying this forward-thinking, data-driven approach is what has kept TDX ahead of the game for the past decade, and what I’m sure will lead to a positive future in the Australian market. Whilst TDX's path in Australia differs from the UK's (after all, the market is different), it would be remiss of us not to acknowledge that we're standing on the shoulders of giants!

By Guy Bourne, Head of Analysis, TDX Australia

Monday, 2 June 2014

Where next for debt buyers?

Over the last year, the debt purchase market in the UK has been dominated by the arrival of the large US debt buyers looking for new opportunities away from their increasingly regulated domestic market. The interesting point here is that regulation is also being stepped up in the UK, and the influx of lower-cost funding into the UK market has only served to push up pricing, which has further depressed IRRs for key debt buyers.

So, with the UK market, like the US, becoming increasingly competitive and more heavily regulated, debt buyers must now look to other markets to purchase assets at high IRRs. One theory behind the rapid expansion from the US to the UK is that buyers are, effectively, using the UK as a bridgehead into debt purchase in potentially more lucrative European markets.

Some of the larger UK debt buyers are already looking at the Spanish market and have started acquiring assets and, most importantly, building performance datasets. However, a more general expansion across Europe has yet to really begin. The key thing holding most debt buyers back is the lack of outcome data and liquidation curves in these new markets, which is a bit of a chicken and egg situation. It’s hard to invest without the data, but you can’t acquire data without investing and learning about the markets.
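As a hedged illustration of why liquidation curves matter to pricing, the sketch below computes the monthly IRR implied by a purchase price and a projected collections curve using simple bisection; the price, the curve and the resulting rate are invented numbers for the example, not market data.

```python
# Illustrative only: the IRR implied by a purchase price and a projected
# liquidation curve. Price and monthly collections are invented numbers.

def npv(rate: float, cashflows: list[float]) -> float:
    """Net present value of month-indexed cash flows at a monthly rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))


def irr(cashflows: list[float], lo: float = -0.99, hi: float = 10.0) -> float:
    """Monthly IRR found by bisection on the NPV sign change."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2


purchase_price = -100.0                                      # paid at month 0
monthly_collections = [12.0] * 6 + [8.0] * 6 + [4.0] * 12    # a crude liquidation curve
cashflows = [purchase_price] + monthly_collections

monthly_rate = irr(cashflows)
print(f"Monthly IRR: {monthly_rate:.2%}, annualised: {(1 + monthly_rate) ** 12 - 1:.1%}")
```

Shift the curve earlier or later by a few months and the implied IRR moves materially, which is exactly why buyers are reluctant to price portfolios without local outcome data.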

This leaves debt buyers with two choices for expansion into the European market:
  • Partner with or acquire local entities that have outcome data from previous purchases or agency activity.
  • Seed a number of markets with low value (and preferably high account volume) purchases to develop datasets for a ramp up of purchasing in the future.
It will be interesting to see how the different purchasers approach the European expansion challenge over the next 12 months. The really striking feature in all this is that whilst some European markets are attractive, it is the emerging markets on a more global scale that really offer the best long-term strategic opportunities. Debt sale as a tool is increasingly prevalent outside of the developed markets and, without significant external competition, local purchasers are being created to meet demand.
The opportunity for significant global growth is there for debt buyers, but it will require much more than just a Eurocentric vision.
 





By Stuart Bungay, Managing Director - International Expansion, TDX Group.