Tuesday, 14 October 2014

The ‘right price’

Recently I was asked by a seller what the right price for their debt was; they wanted to know how many pence in the pound they would get. This got me thinking about how much this concept has changed over time – not only the value but also the definition of the ‘right’ price. I am not going to go into the reasons that different debts are worth different prices (i.e. quality of origination, current debtor situation mix, how hard the debt has been worked to date, etc.); instead, I want to comment on the ‘evolution’ of debt sale.
 
Over the years I have seen three broad definitions of the ‘right’ price. Almost eight years ago when I started out in this industry, the ‘right price’ equated to ‘what is the most I can get for my debt?’ This era was typified by limited data being made available to purchasers, and the debts would often be window-dressed for sale. High turnover of purchaser panels was commonplace, with buyers often being ‘stung’ on price (as they still are in some of the developing markets). In this era, sellers got to a position where it was difficult to sell debt for two reasons:
  1. Purchasers no longer trusted the seller or the quality of the debt.
  2. Those purchasers that did come back offered more realistic prices, but creditor expectations were still anchored to the old, unrealistic prices.
During the middle ages, the ‘right price’ was the price that could be achieved for my debt on a repeatable basis. This era was typified by more data being made available to buyers so they could build confidence in their pricing. As a result, large, relatively stable panels were commonplace, with buyers coming back for more debt at similar prices. In this period purchasers evolved the most – using more and more data to price accurately, reducing their desired rates of return as the move towards transparency reduced their risk, and investing heavily in operational capability to improve returns.
 
Right now, ‘right price’ is the price that will ensure that my customers will be treated fairly. No longer is it purely about price maximisation. As a seller who now retains responsibility for accounts sold, if you seek too high a price it could drive a whole host of activities that wouldn’t fit your wider customer-centric philosophy.
 
In summary, the industry has moved from limited data exchange, to pre-sale openness, to transparency across the whole life of the customer. Creditors now not only want to know how their customers will be treated, but also want evidence to prove they are being treated fairly.
 
I am not sure that everyone’s expectation of the right price has caught up with the times. But this is where we are most definitely headed.
 
By Nick Georgiades, Director of Advisory Services TDX Group

Monday, 6 October 2014

Third party oversight

Recent results of the LSB's review of subscribers' handling of customers in financial difficulties.

I read with interest the recently published summary findings of the Lending Standards Board's (LSB) review of how subscribers to The Lending Code handled customers in financial difficulties.

For those not familiar with the detail, the LSB re-ran a monitoring exercise first initiated in 2013. The review examined the extent to which subscribers and their DCAs are handling customers in financial difficulties, focusing on the policies, processes and controls in place – including areas such as staff training, incentive schemes and complaint root cause analysis. The review also assessed subscribers' due diligence processes when selecting a third party for contingent collections or debt purchase, and the oversight processes in place.

The LSB examined the governance frameworks and processes used by a sample of nine code subscribers and either a DCA or debt purchase firm used by each of them.

The results made for interesting reading. In summary, the reviews resulted in one ‘green’ rating, six ‘amber’ ratings and two ‘red’ ratings for the nine organisations assessed.

The report highlighted general weaknesses in a number of the firms reviewed, including the adequacy of training given to agents dealing with customers in financial difficulty, the completion of affordability assessments and the questioning of customers in financial difficulty. The report indicated, however, that the factors driving the red-rated and weaker amber reports were largely related to ineffective oversight by the subscriber over its outsourced activity and, in one case, inadequate due diligence conducted prior to the subscriber selling debt.

I think the report is interesting for a number of reasons:
  1. At a time when there is a lot of ‘noise’ around the requirement for financial services organisations to focus on FCA readiness, it is a timely reminder that the FCA is only one part of a wider regulatory and compliance regime.
  2. It supports the need for creditors to learn from their peers and to benchmark their organisation against good and best practice from across the industry. Whilst the report is critical of certain organisations' practices, it also calls out a number of examples of good practice and rates one organisation ‘green’ (a potential exemplar for its peers?).
  3. Finally, with lending levels set to increase as market conditions improve, there is likely to be increased demand for both DCAs and debt purchasers to help creditors manage their debt books as they grow. 
It is clear from the report that it is critical that all creditors ‘get their houses in order’ now, particularly with regard to ensuring there is an appropriate level of oversight and due diligence of third parties.

By Charlie Horner, Lead Consultant - Debt Sale and Advisory, TDX Group

Monday, 29 September 2014

So what exactly is a Product Manager?

I’ve been at TDX Group for six years this month. I know I look older, but that’s actually over half of my post-university working life. I’ve spent most of that time working within our Debt Sale business, focused mainly on the delivery of a service to our clients and becoming a subject matter expert on debt sale.

More recently, I started a project with various internal teams to develop our new debt sale platform, VENDO. Then I got the chance of an internal move into our Products department, to formally take ownership of VENDO along with some of our Industry Solutions products. It's a great opportunity: a chance to apply what I've learned over the last six years in a different way, whilst learning some new skills.

So, having become a Product Manager, I thought I should be proactive and do some independent reading on product management practices. I started by looking online and Google took me to a website which was clear, concise and talked about product management with a little Venn diagram. It simply described a product manager as the intersection between Business, Technology and User Experience. It recommended a book, which I duly bought and downloaded onto my Kindle.
I eagerly opened the book and scanned through the contents pages. 40 chapters spread over 220 pages. None of the chapters said ‘Summary’ or ‘Top three things you need to know’ or anything like that, so I put it down and thought I’d have a read later.

To appeal to someone like me, the book needs a nice summary; something to hook me in and help me decide I want to read it. I guess it's too late now that I've bought it, but of course I won't recommend it to anyone until I've read it and decided if it's any good.

So I did learn one valuable lesson about Product Management from the book. You must think about your end user. I’m fairly sure I’m not unique in my desire for the five minute summary yet the author, editors and publisher failed to consider me when they created the product. They’ve missed out on appealing to a whole group of users.

I might get round to reading the book at some point. Thankfully I have a team of experienced colleagues around me who can help me learn more about good product management. But I certainly know that a good product needs to meet the needs of a range of users and that should be central to its design.

By Andy Taylor, Product Manager - Debt Sale, TDX Group

Friday, 18 July 2014

Why do we need Software Testing?

This is an excellent question, and one that regularly gets asked in organisations that have to deliver projects and software. Why can’t the developers just test it? Why can’t the end users test it? Surely anyone can test?

The growth of Software Testing as an industry over the last 20 years is a clear indication of the importance that large and small businesses place on having workable, easy-to-use software. It is no coincidence that this growth has accelerated as we now use software in everything we do – surfing the internet, in our cars, on our tablets and mobile devices, even typing this blog! So we, as users, should know what good looks like, and what bad looks like…
We have all had moments when a programme crashes mid-use, data goes missing, or you're trying to book a holiday and the website illogically asks you to re-enter all your details! So, does using these programmes make you a software tester?
Being a software tester is like being a food critic really – I, personally, have no idea how to make a chocolate soufflé or a fricassee of mung beans and samphire, but I do know whether or not I like the taste. However, food critics have an advanced knowledge of food combinations, an objective and consistent opinion, and tend to advocate high-quality food. Software Testers are similar – they may not necessarily know how to develop the next Windows or Mac operating system, but they will definitely know whether it's good or not, and their opinion in the marketplace affects the view of whether it is a successful and popular product. It can make or break a version, a product or even a company.
However, even software testing skills are changing. Testers are becoming ever more highly skilled and are bridging the gap between development and testing by learning coding techniques. This allows for more automated testing and makes the testing even more efficient and effective. With software becoming ever more sophisticated, the number of test scenarios that can arise from a seemingly simple piece of functionality can be mind-boggling and run into the millions – it would take a human tester years to cover every scenario, and even a risk-based approach would eat resources without covering every possible outcome. As a consequence, the work of software testers is becoming much more about using clever programmes and a variety of tools to cover as much ground as possible.
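
To make that combinatorial point concrete, here is a minimal sketch of parameterised, automated testing in Python with pytest. The validate_discount rule and its input values are entirely made up for illustration – the point is simply that four small inputs already generate over a hundred scenarios, which an automated suite can enumerate exhaustively where a manual tester could only ever sample.

```python
# A toy sketch of parameterised, automated testing with pytest.
# validate_discount and its rule are invented purely to show how quickly
# scenarios multiply for a "simple" piece of functionality.
import itertools

import pytest


def validate_discount(customer_type, channel, currency, amount):
    """Toy rule: only retail customers ordering over 100 GBP on the web get a discount."""
    return (customer_type == "retail" and channel == "web"
            and currency == "GBP" and amount > 100)


CUSTOMER_TYPES = ["retail", "business", "staff"]
CHANNELS = ["web", "phone", "branch"]
CURRENCIES = ["GBP", "EUR", "USD"]
AMOUNTS = [0, 100, 101, 5000]

# 3 x 3 x 3 x 4 = 108 scenarios from just four inputs; automation enumerates
# them all, where a manual, risk-based approach would only sample a handful.
ALL_CASES = list(itertools.product(CUSTOMER_TYPES, CHANNELS, CURRENCIES, AMOUNTS))


@pytest.mark.parametrize("customer_type,channel,currency,amount", ALL_CASES)
def test_discount_rule(customer_type, channel, currency, amount):
    expected = (customer_type == "retail" and channel == "web"
                and currency == "GBP" and amount > 100)
    assert validate_discount(customer_type, channel, currency, amount) == expected
```

Run through pytest, those 108 cases execute in a fraction of a second – which is exactly the efficiency that automation buys.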
We know that we can never test every possible variable – it's impossible; why else do Microsoft and Apple need updates? Things change, and change needs testing. We can, however, reduce risk – recent high-profile cases in the press, such as Amazon, highlight the fact that even the slightest mistake can cost a company millions. Data is now one of the main currencies in the world, and the Data Protection Act and privacy laws mean that breaches caused by software errors are treated with the highest level of severity and mistakes are not tolerated. Cloud computing, multiple access points and internet forums are all threats to a company's reputation and balance sheet.
So back to the question – why do we need software testing? The answer is to reduce the risk of external failure. Internal failure, such as a defect, is fine because we can fix it and deal with it; however, if software has an external failure then the world knows and it's too late. Testers are a different breed – some say pedantic (and they are right) – but without them who will check, to the nth degree, that a button on a website does what it should do and doesn't do what it shouldn't?
Here at TDX Group we strive to ensure that all our software is tested following industry best practice, the tools we use are cutting-edge and the testers we hire are multi-skilled. We reduce risk and think of our customers – they don't want 300 buttons when one will do! And we will continue to do so because we build our reputation on quality. We strive to reach the impossible goal and dream of the day we can say – you know what? We have managed to test everything.

So next time you use a website and you click the submit button, think of how much data has been validated, stored, organised, processed and actioned to get that button to work, and of the thousands of tests that will have checked that the date of birth you entered is valid and correct, that your password and username combination satisfies the criteria, and that everything just works – that's because we checked it all.
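
As a purely illustrative aside (a sketch in Python, and emphatically not our actual test suite), here is the kind of field-level check that sits behind that submit button; the age cap and password policy are assumptions invented for the example:

```python
# A purely illustrative sketch of the field-level checks behind a submit
# button; the 130-year age cap and the password policy are assumptions
# invented for this example, not any real organisation's rules or tests.
from datetime import date, timedelta


def is_valid_dob(dob: date) -> bool:
    """A date of birth must be in the past and imply a plausible age."""
    today = date.today()
    return dob < today and (today.year - dob.year) < 130


def is_valid_password(password: str, username: str) -> bool:
    """Hypothetical policy: 8+ characters, mixed case, a digit,
    and it must not contain the username."""
    return (len(password) >= 8
            and any(c.isupper() for c in password)
            and any(c.islower() for c in password)
            and any(c.isdigit() for c in password)
            and username.lower() not in password.lower())


# A handful of the thousands of checks a tester would automate:
assert is_valid_dob(date(1980, 5, 1))
assert not is_valid_dob(date.today() + timedelta(days=1))   # future dates rejected
assert is_valid_password("Str0ngPass", "andy")
assert not is_valid_password("andy1234", "andy")            # contains the username
```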

By Paul Sibley, Software Testing Manager, TDX Group

Thursday, 10 July 2014

Cake, cake, cake

Working at TDX Group can be a challenge, and one of the biggest I’ve faced since joining the TDX Group team is all the goodies that are so regularly on offer to celebrate our success!

June saw the final round of the TDX Group cake bake off – the show-stopper round, and the celebratory afternoon tea. Now, I’m all for celebrating but it comes at a price; my diet app doesn’t like it!

Over the past 10 years I've been a slave to my weight. Like many people I've been on a range of diets, some successful and some not. I'm under no illusion and realise that the main blocker to my success is usually me; after all, most diets are simply a controlled way of restricting calorie intake while promoting exercise. The similarity I'd like to draw between dietary habits and information security is that applying them both successfully is a tricky balance between control and manageability.

During periods of over-indulgence, I'm without restriction and, quite frankly, anything can happen… Imagine a world where nothing is controlled and colleagues are left to get on with their day without security controls or restrictions: no content filtering to slow down progress, no anti-spam software getting in the way of the legitimate emails it sometimes blocks, no policies, procedural controls or anti-virus, and so on. Viruses would quickly and easily get into the network, information would soon get lost or become compromised and our business would fall over; the weight gets piled on.

At the other end of the scale you could imagine something from Mission Impossible; security through ultimate control. To access a system you enter a fort by passing through a guarded barrier with a photo ID proximity pass, you move on to another secure door with retina or fingerprint scanning, and then through a final secure door with a key-coded lock. Once inside, you access a standalone system with no internet or network connectivity and use multi-factor authentication to log on to a PC which doesn't permit removable media. Nice and secure – there is no way for a virus to get in, or data to get out – but the day job is impossible and the user will soon start to look for cheats and workarounds. Those 500-calorie-a-day diets have such strict controls in place that it seems impossible to stick to them while retaining your sanity; losing weight is guaranteed, but it's unfeasible as a long-term solution.

So, we apply a risk-managed approach which compares what colleagues want to do against the long-term risk of them doing it; too much control and they can't work effectively and look for insecure alternatives, too little and things start to fall over…

My best dieting successes have come from a blend of control and balance; everything in moderation. Losing control and having that big slice of cake won't help with weight loss, and watching everyone eat while you stay in ultimate control may well send you crazy, but just a small slice will keep you happy and is unlikely to scupper the long-term plan.

By Vicky Clayton – Information Security Officer, TDX Group
 

Wednesday, 2 July 2014

Looking good? The importance of design in Management Information

I have already talked about the principles behind making great Management Information (MI) but there is one final area that is often overlooked, despite being the most obvious: design. Truly great MI has to be well designed in order to have a real impact and to be really appreciated within a business.
Nowadays there is an ever-evolving love affair with data visualisation. Some see it as an opportunity to bring data and analysis to a wider audience through more relatable visuals, whilst others see it as an art form in itself. However, for me data visualisation should do one of two things: either tell a story or bring a complex data set to life.

You will probably have seen infographics that tell a story, usually breaking down a topic to its key facts and broader implications to make for an engaging read, such as this gem on ‘Documents’. Infographics do have their use within a business; however, they are most powerful as marketing tools and a way of engaging with clients, both new and existing. Turning complex data into a visual that makes instant sense is a difficult thing to do, as anyone who has ever tried to represent a large data set in Excel will know. Take, for example, this chart which shows the connections and activity of Facebook users across the globe. By transposing the data onto a familiar image (the earth) and representing activity through the neon lines, we can easily relate to the data and instantly pick out interesting talking points such as China, South America and Africa. Not only is it functional, it is also beautiful, and I am a great believer in spending time on designing charts to both look good and be useful; it makes explaining them much easier.

In my time as a Consultant and as an Analyst at TDX Group I have put together many reports and MI dashboards, and have always been willing to put the extra time and effort into making their appearance as good as their content. In a recent project I presented some example MI in the client's branding, which enabled them to relate to the examples in a more meaningful way. The discussion could then focus on the concepts of building an MI suite rather than on explaining unfamiliar examples.

I have also found that spending the time to make a chart look right has a great impact on how it is received. The biggest challenge is usually finding the best way to represent the relationships between data points and how they affect one another – the message is often lost when each point has its own visual, but combining them into one chart can change the conversation.
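
As a minimal sketch of what I mean – with made-up figures and matplotlib, rather than any client's actual MI – plotting two related KPIs on a single chart, instead of in two separate visuals, makes the relationship between them immediately visible:

```python
# A small, made-up example (using matplotlib) of combining two related KPIs
# on one chart instead of giving each its own visual. The figures and KPI
# names are invented for illustration only.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
accounts_placed = [1200, 1350, 1500, 1480, 1620, 1750]   # volume placed for collection
collection_rate = [4.1, 4.3, 3.9, 4.6, 4.4, 4.8]         # % of balance collected

fig, ax_volume = plt.subplots(figsize=(8, 4))
ax_volume.bar(months, accounts_placed, color="#c8d6e5", label="Accounts placed")
ax_volume.set_ylabel("Accounts placed")

# Overlay the rate on a secondary axis so the relationship between the two
# series can be read from a single view.
ax_rate = ax_volume.twinx()
ax_rate.plot(months, collection_rate, color="#d9534f", marker="o",
             label="Collection rate (%)")
ax_rate.set_ylabel("Collection rate (%)")

fig.suptitle("Placement volume vs collection rate")
fig.tight_layout()
plt.show()
```

Here the secondary axis does the work that two separate charts would otherwise force the reader to do in their head.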

To me the design elements are just as crucial as getting the KPIs and the data correct. The design is often what will enable your MI to be read and understood on a wider scale. A well-designed MI suite reflects a knowledge and understanding of the business that gives confidence to those who rely on it on a daily basis.

By Stephen Hallam, Consultant, TDX Group

Tuesday, 24 June 2014

Head in the clouds

I recently read an article on the BBC news site about wastage in local government. The statistic that really stood out to me was that, of the £440 million spent by councils on IT in 2012-2013, only £385,000 was spent via G-Cloud – the government's digital marketplace for procurement of IT systems and services. That's less than 0.1% of spending, a staggeringly small proportion in a period of widespread cuts and on-going efficiency drives.

The fact that councils aren't embracing G-Cloud isn't the biggest issue here; it's the slow adoption of the wider concept of cloud-based IT as the preferred approach. As of 2013, around 30% of councils used no cloud-delivered services. The 70% embracing the cloud sounds promising, but when we dig deeper this tends to be in one or two niche areas within the council, or just email, with most local authorities continuing to spend the majority of funds on traditional on-premise IT and maintaining legacy systems.
There are two main reasons I'm interested in this, the first being the most obvious one: cost. Cloud services tend to be cheaper. There is no hardware on site, meaning lower initial setup and on-going maintenance costs. This makes a big difference, as today 38% of IT budgets tend to be spent on support and maintenance. You also avoid waste: with traditional on-site hardware a large proportion of the functionality and computing power may never be used, but with the cloud you can generally pick and mix from modular options, and the hardware itself can be shared with other users.
The second, and more interesting, reason is innovation; to me the cloud means progress. Cloud services can be updated quickly, with improvements rolled out to users remotely. Systems aren't installed on site and forgotten about; they can evolve and improve, with all customers benefitting from new features and functionality. A cloud-based solution encourages the provider to work with their customers to optimise for the entire user-base, rather than having to develop bespoke solutions for every client. This drives innovation and can result in significant benefits for customers, as it becomes far easier to embrace new approaches and best practice. Interestingly, this comes back to my original point: sharing services between local authorities, or even between the public and private sectors, doesn't just save on IT costs; it results in better, more flexible systems which lead to improved services that are both more efficient and more effective – in short, you're spending less and getting more.
One final thought while we're talking about sharing: what about taking it a step further? Cloud services create the opportunity to share data and insight, not just servers and IT support. It might be a bit of a leap, particularly in the public sector, but knowing more is generally a good thing, and sharing data is a good way to get there. There may be hurdles to jump, but joining up these systems and maximising the use of data within and between local authorities – whether in revenues and benefits, public transport or housing – might have a far greater impact on cost savings than the current practice of reducing household or community services.

By Patrick O'Neil, Head of Pre-Sales Consulting, TDX Group