Telecommunications, Cable & Utilities



TL;DR: Read on for how Touch ID is made possible by ARM's TrustZone/TEE, and why this matters in the context of Apple's coming identity framework. I also explain why primary/co-processor combos are here to stay. I believe Touch ID eventually has a payments angle – focusing on e-commerce before retail. Carriers will weep over a lost opportunity, while through Touch ID we get front-row seats to Apple's enterprise strategy, its payment strategy and, above all, the future direction of its computing platform.

I shared my take on a possible Apple biometric solution in January of this year, based on its AuthenTec acquisition. I came pretty close, except for my suggestion that NFC was likely to be included. (Sigh.) Here's what I wrote then: It's a bit early to play fast and loose with Apple predictions, but its AuthenTec acquisition should rear its head sometime in the near future (2013 – considering Apple's manufacturing lead times). A biometric solution packaged neatly with an NFC chip and secure element could address three factors that have held back customer adoption of biometrics: ubiquity of readers; issues around secure local storage and retrieval of biometric data; and standardization in accessing and communicating said data. An on-chip secure solution to store biometric data – in the phone's secure element – can address qualms around a central database of biometric data open to all sorts of malicious attacks. Standard methods to store and retrieve credentials stored in the SE will apply here as well.

Why didn't Apple open up Touch ID to third-party developers? Apple expects a short, bumpy climb for Touch ID before it stabilizes, as early users begin to use it. By keeping its use limited to authenticating to the device and to iTunes, Apple can tightly control potential issues as they arise. If Touch ID had launched with third-party apps and proved buggy, customers would likely have been confused about where to report issues and whom to blame.
That's not to say that it won't open up Touch ID outside of Apple. I believe it will provide fettered access based on the type of app and the type of action that follows user authentication. Banking, payment, productivity, social-sharing and shopping apps should come first. Your fart apps? Probably never. Apple could also allow users to set preferences (by app category, based on the user's current location, etc.) such that biometric authentication is required for risky transactions but not for routine ones. If you are at home and buying an app for a buck, don't ask to authenticate. But if you were initiating a money transfer, you would. Even better – pair biometrics with your PIN for stronger security. Chip and PIN? So passé.

Digital signatures, iPads and DRM 2.0: It won't be long before an iPad shows up in the wild sporting Touch ID. And with BlackBerry's much-awaited and celebrated demise in the enterprise, Apple will be waiting on the sidelines – now with capabilities that allow digital signatures to become ubiquitous and simple, on email, contracts or anything else worth putting a signature on. Apple has already made its iWork productivity apps (Pages, Numbers, Keynote), iMovie and iPhoto free for new iOS devices activated with iOS 7. Apple, with a core fan base that includes photographers, designers and other creative types, can now further enable iPads and iPhones as content-creation devices, with the ability to attribute any digital content back to its creator via a set of biometric keys. Imagine a new way to digitally create and sign content, and to share it freely without worrying about attribution. Further, Apple's existing DRM frameworks are strengthened by the ability to tag the digital content you download with your own set of biometric keys. Forget disallowing the sharing of content – Apple now has a way to create a secondary marketplace where its customers can resell or loan digital content, driving incremental revenue for itself and content owners.
Conclaves blowing smoke: In a day and age when we forgo the device for storing credentials – whether for convenience or ease of implementation – Apple opted for an on-device answer to where users' biometric keys are stored. There is a reason it did so, beyond the obvious brouhaha that would have resulted had it chosen to store these keys in the cloud. Keys inside the device. Signed content in the cloud. Best of both worlds. Biometric keys need to be held locally so that authentication requires no roundtrip and therefore imposes no latency. Apple would have chosen local storage (ARM's SecurCore) as a matter of customer experience – consider what happens when the customer has no internet access. There is also the obvious concern that a centralized biometric keystore would be in the crosshairs of every malicious entity. By decentralizing it, Apple made it infinitely more difficult to scale an attack or a potential vulnerability.

More than the A7, the Trojan horse in Apple's announcement was the M7 chip, referred to as the motion co-processor. I believe the M7 does more than measure motion data. M7 – a security co-processor? I am positing that Apple is using ARM's TrustZone foundation, and that it may be using the A7 or the new M7 co-processor to store these keys and handle the secure backend processing required. Horace Dediu of Asymco called into question why Apple opted for the M7 and suggested it may have a yet-unstated use. I believe the M7 is not just a motion co-processor; it is also a security co-processor. I am guessing the M7 is based on the Cortex-M series of processors and offloads much of this secure backend logic from the primary A7 processor – and the keys themselves are likely stored there on the M7.
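The local-storage argument above can be made concrete with a toy sketch. This is a conceptual model of the TrustZone idea, not Apple's actual implementation; the class and token format are invented, and real biometric matching is fuzzy rather than an exact byte comparison:

```python
import hmac, hashlib, os

class SecureEnclaveSketch:
    """Toy model of TrustZone-style local storage: the enrolled template
    and signing key live inside the enclave and never cross its boundary."""

    def __init__(self, enrolled_template):
        self._template = enrolled_template   # biometric data stays on-device
        self._key = os.urandom(32)           # device-bound secret key

    def authenticate(self, scanned_template):
        # Matching happens entirely locally: no network roundtrip, and no
        # central keystore for attackers to target at scale. (Exact-match
        # comparison is a simplification of fuzzy biometric matching.)
        if hmac.compare_digest(self._template, scanned_template):
            # Only a derived token leaves the enclave -- never the key
            # or the template itself.
            return hmac.new(self._key, b"auth-ok", hashlib.sha256).digest()
        return None
```

A successful scan yields only a signed token that the OS can hand to iTunes or an app; compromising one device never exposes a population of templates, which is precisely the scaling argument made above.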
The Cortex-M4 chip has capabilities that sound very similar to what Apple announced around the M7 – a very low-power chip built to integrate sensor output and wake up only when something interesting happens. We should know soon. This type of combo – splitting functions off to different cores – allows each core to focus on the function it is supposed to perform. I suspect Android will not be far behind in adopting this approach, with each core focusing on one or more specific layers of the Android software stack. Back at Google I/O 2013, Google announced new location APIs (including the fused location provider) that enable location tracking without the traditional heavy battery consumption. It looks to me as if Android decoupled these functions so that we will soon see processor cores dedicated to them.

I am fairly confident that Apple has opted for ARM's TrustZone/TEE. Implementation details of TrustZone are proprietary and therefore not public. Apple could have made revisions to the A7 chip spec and co-opted its own, but using TrustZone/TEE and SecurCore allows Apple to adopt existing standards around accessing and communicating biometric data. Apple is fully aware of the need to mature iOS into a trusted enterprise computing platform – one with the hardware-backed security that low-end x86 devices lack – and this is a significant step toward that future.

What does Touch ID mean for payments? Apple's plans for Touch ID kick off with iTunes purchase authorizations. Beyond that, as iTunes continues to grow into a media-store behemoth, Touch ID has the potential to drive fraud risk down for Apple – and to further reduce risk as it batches up payment transactions to lower its interchange exposure.
It's quite likely that, à la Walmart, Apple has negotiated rate reductions – but now it can assume more risk on the front end because it is able to vouch for the authenticity of these transactions. As they say, customers can no longer plead the Fifth on those late-night weekend drunken purchase binges. Along with payment aggregation, or via iTunes gift cards, Apple now has another mechanism to reduce its interchange and risk exposure. Now imagine if Apple were to extend this capability beyond iTunes purchases and allow app developers to process in-app purchases of physical goods or real-world experiences through iTunes in return for better blended rates (instead of PayPal's 4% + $0.30). Heck, Apple could even opt for short-term lending if it can effectively answer the question of identity – as it can with Touch ID. It's PayPal's 'Bill Me Later' on steroids. Effectively, for a company like Apple – which has seriously toyed with the idea of a software SIM and a "real-time wireless provider marketplace," where carriers bid against each other to provide you voice, messaging and data access for the day and your phone picks the optimal carrier – how far is that notion from picking the cheapest rail across networks for funneling your payment transactions? Based on the level of authentication provided, or on other known attributes such as merchant type, location, fraud risk and customer payment history, iTunes could select from a variety of payment options and pick the one that is optimal for the app developer and for itself. And finally, who had the most to lose with Apple's Touch ID? Carriers. I wrote about this before as well; here's what I wrote then (edited for brevity): Does it mean that carriers have no meaningful role to play in commerce? Au contraire. They do. But it's around fraud and authentication. It's around identity.
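The selection logic described above amounts to least-cost routing. A minimal sketch, with invented rails and cost functions – none of these numbers are Apple's, PayPal's, or any network's actual pricing:

```python
# Hypothetical payment rails and per-transaction cost functions.
RAILS = {
    "card_cnp":   lambda amt: 0.029 * amt + 0.30,  # card-not-present style rate
    "card_batch": lambda amt: 0.020 * amt + 0.10,  # assumed negotiated/batched rate
    "ach":        lambda amt: 0.25,                # flat per-transaction cost
}

def route(amount, allowed_rails):
    """Pick the cheapest rail among those the risk engine has allowed."""
    return min(allowed_rails, key=lambda rail: RAILS[rail](amount))
```

A risk engine would shrink `allowed_rails` for poorly authenticated transactions – for instance, no ACH without Touch ID-grade authentication – which is exactly how authentication quality would feed rail choice.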
… But they seem stuck imitating Google in figuring out a play at the front end of the purchase funnel, trying to become a consumer brand (Isis). The last thing they want to do is leave the "identity management" question to Apple, which seems best equipped to answer it by way of scale, the control it exerts over its ecosystem, a vertical-integration strategy that allows it to fold biometrics meaningfully into its lineup, and the ability to start with its own services to offer customer value. So there must have been much 'weeping and moaning and gnashing of teeth' on the carrier front with this launch. Carriers have been so focused on carving out a place in payments that they lost track of what's important: once you have solved authentication, payments is nothing but accounting. I didn't say that. Ross Anderson did, at the Kansas City Fed. What about NFC? I don't have a bloody clue. Maybe iPhone 6? This is a re-post from Cherian's original blog post "Smoke is rising from Apple's Conclave".

Published: October 2, 2013 by Cherian Abraham

By: Matt Sifferlen. I recently read interesting articles on the Knowledge@Wharton and CNNMoney sites covering the land grab taking place among financial-services startups trying to use a consumer's social media activity and data to make lending decisions. Each of these companies is looking at ways to take the mountains of social media data that sites such as Twitter, Facebook, and LinkedIn generate and create new and improved algorithms that will help lenders target potentially creditworthy individuals. What are they looking at specifically? Some criteria could be: a history of typing in ALL CAPS or all lower case; frequent use of inappropriate comments; the number of senior-level connections on LinkedIn; the quantity of posts containing cats or annoying self-portraits (aka "selfies"). Okay, I made that last one up. The point is that these companies are scouring the data individuals create on social sites and trying to find useful ways to slice and dice it in order to evaluate and target consumers better.

On the consumer banking side of the house, there are benefits to tracking down individuals for marketing and collections purposes. A simple search could yield a person's Facebook, Twitter, or LinkedIn profile. That behavioral information can then be leveraged as part of more targeted multi-channel contact strategies. On the commercial banking side, social-site information can help supplement traditional underwriting practices. Reviewing the history of a company's reviews on Yelp or Angie's List could offer insight into how the business is perceived, and reveal whether there is any meaningful trend in the negative feedback being posted or in the company's growth outlook.

There are some challenges involved in leveraging social media data for these purposes: 1. Easily manipulated information. 2. Irrelevant information that doesn't represent actual likes, thoughts or relevant behaviors. 3. Regulations. From a fraud perspective, most online information can easily and frequently be manipulated, creating a constantly moving target for these providers to monitor and link to the right customer. Fake Facebook and Twitter pages, false connections and referrals on LinkedIn, and fabricated positive online reviews of a business can all be accomplished in a matter of minutes. And commercial fraudsters are likely creating false business social media accounts today for shelf-company fraud schemes they plan to hatch months or years down the road. As B2B review websites continue to make it easier to sign customers up for their services, the downside is that even more unusable information will be created, since there are fewer and fewer hurdles for commercial fraudsters to clear – particularly on sites that offer their services for free. For now, larger lenders are more likely to utilize alternative data sources that are third-party validated, like rent and utility payment histories, while continuing to rely on tools that protect against fraud schemes. It will be interesting to see what new credit and non-credit data become common practice in the future as lenders continue their efforts to find more useful data to power their credit and marketing decisions.
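As a toy illustration of the kind of feature extraction and scoring described above – the signals, weights and base score below are invented for illustration, not any lender's actual model:

```python
def social_features(posts, connections):
    """Extract toy signals of the sort listed above from raw profile data."""
    n = max(len(posts), 1)
    return {
        "all_caps_ratio": sum(p.isupper() for p in posts) / n,
        "senior_connections": sum(c.get("level") == "senior" for c in connections),
    }

def toy_score(features):
    # Linear scorecard over the toy signals; real models would be far richer,
    # and constrained by the regulations noted above (challenge #3).
    weights = {"all_caps_ratio": -30.0, "senior_connections": 2.0}
    return 600.0 + sum(weights[k] * v for k, v in features.items())
```

The fraud concern in the text maps directly onto this sketch: every input here (posts, connections) is user-controlled, so a fabricated profile moves the score at will.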

Published: September 25, 2013 by Guest Contributor

Contact information such as phone numbers and addresses is fundamental to being able to reach a debtor, but knowing when to reach out is also a crucial factor in whether you succeed in getting paid. As referenced in the chart below, when a consumer enters the debtor life cycle, they often avoid talking with you about the debt because they do not have the ability to pay. When the debtor begins to recover financially, you want to be among the first to reach out so you can be the first to be paid. According to Don Taylor, President of Automated Collection Services, they have seen more than 12% of consumers with trigger hits enter repayment – and this on an aged portfolio that had already been actively worked by debt collection staff. Monitoring for a few key changes on debtors' credit profiles provides the passive monitoring needed to tell you the optimal time to reach back out to the consumer for payment. Experian compiled several recent collection studies and found that a debtor paying off an account that was previously past due corresponded to a 710% increase in the average payment. Positive improvement on a consumer's credit profile is one of those vital indicators that the consumer is beginning to recover financially and could have the will – and ability – to pay bad debts. The collection industry is not like the big warehouse stores: quantity and value do not always go hand in hand. Targeting the high-value credit events that are proven to increase collection amounts is the key to value, and Experian has the expertise, analytics and data to help you collect in the most effective manner. Be sure to check out our other debt collection blog posts to learn how to recover debt more quickly and efficiently.
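The trigger-monitoring approach above amounts to a simple prioritization rule. A sketch, with invented trigger names and weights – this is illustrative, not Experian's actual trigger product:

```python
# Hypothetical credit-profile triggers and relative weights.
TRIGGER_WEIGHTS = {
    "paid_off_past_due": 7.1,   # the strongest recovery signal cited above
    "new_tradeline": 2.0,
    "score_improved": 1.5,
}

def prioritize(accounts):
    """Work the accounts with the strongest financial-recovery signals first."""
    def signal(acct):
        return sum(TRIGGER_WEIGHTS.get(t, 0.0) for t in acct["triggers"])
    return sorted(accounts, key=signal, reverse=True)
```

The point of the passive-monitoring argument is that this sort ordering changes by itself as trigger hits arrive, so collectors call each debtor at the moment recovery is most likely.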

Published: June 10, 2013 by Guest Contributor

At midnight yesterday, Google sent me an email about how the new Google Wallet update will now allow me to store my "Citi MasterCard" online. As other Google Wallet aficionados may recall (Bueller…? Bueller…?), Citi was the lone holdout in Google Wallet's journey to the cloud and its race to conformity. Though to the untrained eye the Google Wallet app experience was mostly uniform irrespective of the card used to pay at the point of sale, behind the scenes, if the Citi MasterCard was used, Google had to do things one way versus another way for the rest of the brood. Furthermore, sharing the precious real estate that is the secure element with Citi meant that Google had very little room to maneuver. Embedded SEs, despite being newer to market than SIM-based SEs, were limited in storage versus other chips. The initial embedded SEs that Google Wallet relied on had about 76KB of memory, which – once you factor in all the trimmings that come with provisioning a card to the SE (the MasterCard PayPass applet among others) – left very little wiggle room. So Google, forced by a number of factors (resistance from carriers and issuers, rising costs and complexities attributable to the multiple-TSM model, a lack of SE space to accommodate future provisioning), migrated to the cloud – and left a MasterCard proxy in the wallet that it could use to funnel transactions through. The only standout from this model was the umbilical cord to the original Google Wallet partner: Citi. I had predicted last September that the partnership's days were numbered. When the wallet is Google's, and it needs both to reclaim space on the SE and to reduce the provisioning and account-management costs it owes its TSM (First Data), the only reason to keep carrying the torch for Citi would be if Google Wallet customers demanded it. And it so happens that returns for items purchased using Google Wallet had, until today, also been slightly broken.
If you bought an item using the virtual MasterCard, returns followed one route; if you purchased via the Citi card, returns were handled a different way. Additionally, it was disappointing for a customer to see "PayPass Merchant" instead of "McDonald's" and "Sent" instead of "$25.54" when paying with the Citi card in Google Wallet (unless one was planning to hide a fast-food habit from a spouse). A small mess – especially when it should be attributed to powers beyond the partnership, but still a mess for Google, which demands conformity in customer experience across all its offerings. In the end, this partnership served no broader purpose for either partner to keep it alive any longer. Google is ready to move beyond Wallet 1.0 and realizes it can do so without issuers in tow. Furthermore, it had been expected for the better part of three months that Google would launch its partnership with Discover, and this puts Google back in the mobile payment narrative as an indispensable element. For the issuers originally courted by Google Wallet in its early days, this serves as validation that they were correct in choosing to stay away. But that is no excuse for ignoring what Google and others are building: a parallel framework to the value-added services (credit card rewards being one) that card issuers count on to keep customers choosing them over Google. (And if Google could tout interchange relief to merchants as an incentive to court them, don't you think a Google Rewards program will be close behind, supported by credits redeemable in the Google Play store? Once again, it's not an if, but a when.) Finally, where does this leave Citi? Citi is a global institution with enough smart people to make up for lost time. Google Wallet did not become the boogeyman that issuers feared back in 2011, and Citi can afford to roll out its own mobile initiatives at a measured pace and a global scale.
And there had been rumblings of a Citi wallet all through 2012; we may see it manifest outside the U.S. before Citi attempts anything here. Google may have opted to cut the cord so that there is no ambiguity when that happens. But it still has both Citi and First Data to thank for bringing it to the prom. You dance with the one that brung ya… or something like that. Do you think this means Google Wallet is now adrift, loyal only to its own quest? What's next for Citi? What do you think? Please leave your opinions below. This is a re-post from Cherian's personal blog at DropLabs.

Published: January 31, 2013 by Cherian Abraham

MCX – MerChants reduX: The post that follows is a collection of thoughts around MCX: why it deserves respect, and yet how it is indeed mortal and bleeds like all the others. For those not familiar with MCX, it is a consortium of over 30 leading national retailers with a singular purpose – to create a seamlessly integrated mobile commerce platform. The website for MCX is http://www.mcx.com. The consortium is led by merchants like Walmart, Target, CVS, Best Buy, Gap and Sears. By 2012, the mobile payments space was fragmented as it was, which itself may have precipitated the launch of MCX. And for a number of solutions looking for traction, things ground to a halt when MCX pitched merchants a solution that needed no costly upgrades, along with a promise to route transactions over low-cost routing options. My friends on the issuer side privately confide that MCX has in fact succeeded in throwing a monkey wrench into their mobile payment plans – merchant acceptance now looks uncertain for incumbent initiatives such as Isis and Google Wallet, as well as for alternative payment initiatives. It had been easy to dismiss MCX as mere posturing in the early days, but of late there is a lot of hand-wringing behind the scenes and too many furrowed brows, as if the realization finally struck that merchants were indeed once again crucial to mobile payment adoption.

MCX – its raison d'être: Meanwhile, the stakeholders behind MCX have been religious in their affirmation that MCX lives by two core tenets: first, it aims to drastically reduce payment acceptance costs through any and all means; and second, it aims to keep merchant data firmly within the merchants' purview.
I can't help but think that the latter was little more than an afterthought, because merchants individually can choose whether to share customer preferences or Level III data with third parties, but they need all the collective clout they can muster to push networks and issuers to reduce card acceptance costs. So if one distills MCX down to its raison d'être, it is aimed squarely at the first tenet. Which is fair when you consider that merchants believe card fees are one of their biggest operating expenses: in 2007, 146,000 convenience stores and gas stations nationwide made a total of $3.4B in profits, yet paid out $7.6B in card acceptance costs (link). And MCX is smart to talk about the value of merchant data, the need to control it, yada yada yada. But if that were indeed the more important tenet, Isis could have been the partner of choice – someone who would treat customer and transaction data as sacrosanct and leave it behind for merchants to fiddle with (versus Google Wallet's mine… mine… mine… strategy). But just as Home Depot was disappointed when it first saw Google Wallet – no interchange relief, incremental benefits at the point of sale, and it swoops up all their data in return – Isis also offers little relief to MCX or its merchants, even though it requires no transaction or SKU-level data in return. Does that mean carriers have no meaningful role to play in commerce? Au contraire. They do. But it's around fraud and authentication. It's around identity. And around creating a platform for merchants to deliver coupons and alerts to opted-in customers. But they seem stuck imitating Google, figuring out a play at the front end of the purchase funnel, trying to become a consumer brand.
The last thing they want to do is leave the "identity management" question to Apple, which seems best equipped to answer it by way of scale, the control it exerts over its ecosystem, a vertical-integration strategy that allows it to fold biometrics meaningfully into its lineup, and the ability to start with its own services to offer customer value. Did we say Apple? It's a bit early to play fast and loose with Apple predictions, but its AuthenTec acquisition should rear its head sometime in the near future (2013 – considering Apple's manufacturing lead times). A biometric solution packaged neatly with an NFC chip and secure element could address three factors that have held back customer adoption of biometrics: ubiquity of readers; issues around secure local storage and retrieval of biometric data; and standardization in accessing and communicating said data. An on-chip secure solution to store biometric data – in the phone's secure element – can address qualms around a central database of biometric data open to all sorts of malicious attacks. Standard methods to store and retrieve credentials stored in the SE will apply here as well. Why NFC? If NFC was originally meant to seamlessly and securely share content, what better way to sign that content – to have it be attributable to its original author, or to enforce one's rights to it – than with one's digital signature? Identity is key, not just for enforcing digital rights management on shared content, but also for securing commerce and addressing payment and fraud risk. Back to MCX. The more I read, the more it seems MCX is imitating Isis – competing for customer mindshare and attempting to become a consumer brand – rather than simply trying to be a cheaper platform for payment transactions.
As commerce evolves beyond being cleanly classified as "card present" or "card not present" – as transactions originate online but get fulfilled in stores – merchants expect the rules to evolve alongside reality. For example, when customers order online but pick up in-store after showing a picture ID, why should merchants pay "card not present" rates? Risk is what we attribute higher CNP rates to, so why assume the same amount of risk in this changed scenario? And beyond that, as technology innovation blurs the lines that once neatly categorized commerce – where we replace "card present" with "mobile present," and mobile carries a significant amount of additional context that could be scored to address or quantify risk – why shouldn't it be? It's a given that networks will have to accommodate reduced risk in transactions where mobile plays a role, where the merchant or the platform enabling the transaction can meaningfully use that context to validate customer presence at the point of sale – and that they will expect appropriate interchange reductions in those scenarios.

MCX – a brand like Isis, or a platform? Reading portions of the linked NRF blog and elsewhere, I see a misplaced desire on MCX's part to become a consumer-facing solution – an app that all MCX partners will embrace for payment. This is so much like the Isis solution of today – which I have written about – and why it flies in the face of reason. Isis – the nexus between carriers and FIs – is a powerful notion, if one considers the role it could play in enabling an open platform around provisioning, authentication and marketing.
But for that future to materialize, Isis has to stop competing with Google, accept that it has little role to play by itself at the front end of the funnel, and recede into the role of an enabler – one that puts its partner FI brands front and center, allows Chase's customers to pay using Chase's mobile app instead of Isis's, and drives down fraud risk at the point of sale by meaningfully authenticating the customer via his location, the mobile assets carriers control, and, further, the historical data carriers have on the customer. It's those three points of data, and the scale Isis can bring, that put carriers credibly in the payments value chain – not the evaporating control around the secure element. In the same vein, the value MCX brings to merchants is the collective negotiating power of over 30 national merchants. But is it a new consumer brand, or is it a platform focused on routing transactions over the least-cost routing option? If it's the latter, then it has a strong parallel in PayPal. And as we see PayPal pop up as legal tender in many a retailer's mobile apps and checkout aisles going forward, MCX is likely to succeed by emulating that retailer-aligned strategy rather than following a brand of its own. Further, if MCX wants customers to pay using less costly means – whether private label, prepaid or ACH – then it and its partners must do everything they can to shift the customer's focus away from preferred payment methods and toward the customer experience and the resulting value around loyalty. MCX must build its value proposition elsewhere, and make its preferred payment methods the bridge that gets the customer there. Another example where a retailer focused too much on the payment and too little on the customer experience is the Safeway Fast Forward program. The value proposition is clear for the customer: pay using your Safeway Fast Forward card number and a self-assigned PIN for simpler checkout.
However, to set up an account, the customer must provide a state-issued ID (driver's license) and, on top of that, his Social Security number (Safeway Fast Forward requirements here). What customer would, for the incremental convenience of paying via his Fast Forward card and PIN, be willing to entrust Safeway with his Social Security number? Clearly Safeway's risk team had a say in this, and instead of coming up with better ways to answer questions around risk and fraud, they introduced a non-starter that killed any opportunity for meaningful adoption.

MCX & adoption: So where does that leave MCX? Why will I use it? How will it address questions around adoption? It's a given that it will have to answer the same questions around fraud and authentication during customer onboarding and at the transaction level. Further, it's not enough these days to answer questions pertaining only to the customer; one must also address the integrity and reputation of the device the customer uses – whether a mobile device or a laptop PC. But beyond fraud and authentication, there are difficult questions around what would compel a techno-luddite who has historically paid using a credit instrument to opt for an ACH-driven (I am guessing) MCX payment scheme. Well, for one: MCX and its retail partners can control the purchasing power of MCX credits. Suppose that, after aggregating customer profiles across retailers, MCX determines the Addams family spends a collective $400 on average per month across all MCX retailers. MCX could propose that if the Addams family instead committed to buying $450 in MCX credits each month, their purchasing power would increase by an additional $45 in credits usable on specific retail categories (or flat-out across all merchandise). Would Morticia be interested? And if she were, what does that mean for MCX?
It eliminates having to pay interchange on roughly $500, and it enables MCX partners to capture incremental spend of about 10% that did not exist before. Only merchants can pull this off – by leveraging past trends and close relationships with CPG manufacturers, and by giving Morticia new reasons to spend the way they want her to. But then again, where does MCX stop providing a level playing field for its partners and step back, so that merchants can compete for their customers and their spend? And finally, can it survive the natural conflicts that will arise, and limit its scope to areas all can agree on – for long enough for it to take root? Should MCX become the next Isis or the next PayPal? Which makes the most sense? What do you think? Please leave your opinions below. (This blog post is an adaptation of the original post found at http://www.droplabs.co/?p=662)
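The arithmetic behind the Addams-family scenario can be made explicit. The figures are the post's own hypothetical example; the 2% card acceptance cost below is an additional assumption, not a quoted rate:

```python
def mcx_credit_offer(baseline_spend, committed_credits, bonus_rate, interchange_rate):
    """Model the hypothetical prepaid-credit trade sketched above."""
    bonus = committed_credits * bonus_rate           # extra purchasing power granted
    purchasing_power = committed_credits + bonus     # what the family can now spend
    incremental_spend = purchasing_power - baseline_spend
    interchange_avoided = purchasing_power * interchange_rate
    return purchasing_power, incremental_spend, interchange_avoided
```

With the post's figures ($400 baseline, $450 committed, a 10% bonus), the family's purchasing power becomes $495 – the "roughly $500" above – all of it prepaid and therefore routed off the card networks entirely.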

Published: January 25, 2013 by Cherian Abraham

All skip tracing data is the same, right? Not exactly. While there are many sources of consumer contact data available to debt collectors, the quality, freshness, depth and breadth can vary significantly. Just as importantly, what you ultimately do or don't do with the data depends on several factors, such as:
- Whether or not the debt is worth your while to pursue
- How deep and fresh the data is
- What to do if no skip data is available, and
- What happens if there is no new information available when you go to your skip-tracing vendor requesting new leads
So what's the best way for your company to locate debtors? What data sources are right for you? Check out my recent article in Collections and Credit Risk for some helpful advice, and be sure to check out our other debt collection industry blog posts for best practices, tips and tricks on ways to recover more debt, faster. What data sources do you find most beneficial to your business and why? Let us know by commenting below.

Published: January 22, 2013 by Guest Contributor

It comes as no surprise to anyone that cell phone usage continues to rise, while at the same time the usage of wire lines, or what used to be affectionately known as POTS (Plain Old Telephone Service), continues to decline. Some recent statistics supplied by the CDC show that:
- 34% of all households are now wireless only
- 25 states have rates of primary wireless use exceeding 50%
- Landline-only households are now down to only 10.2%
When you couple that with churn rates for cell phones that can exceed 40% a year, it becomes paramount to find a good source for cell numbers if you are trying to contact an existing customer or collect on an overdue bill. But where can debt collectors go to find reliable cell phone numbers? The cell phone providers won’t sell you a database, there is no such thing as 411 for cell phones, nor is it likely there will be one in the near future with the aforementioned 40%+ churn rates. Each cell phone service provider will continue to protect its customer base. There are a few large compilers of cell phone numbers; they mostly harvest these numbers from surveys and from sources that capture the numbers as part of an online service—think ringtones here! These numbers can be good, at least initially, if they came with an address that enables you to search for them. The challenge is that these numbers can grow stale relatively quickly. Companies that maintain recurring transactions with consumers have a better shot at having current cell numbers. Utilities and credit bureaus offer an opportunity to capture these self-reported numbers. At our company, over 40% of self-reported phones are cell phones. However, in most cases, you must have a defined purpose as governed by Gramm-Leach-Bliley (GLB) in order to access them. Of course, the defined purpose also goes hand in hand with the Telephone Consumer Protection Act (TCPA), which restricts use of automatic dialers and prohibits unsolicited calls to a cell phone. Conclusion?
If you are trying to find someone’s cell number for debt collection purposes, I recommend using a resource that is more likely to receive updates on the owner of a cell number, over compilers working with one-time event data. In today’s world, obtaining an accurate, current cell number is a challenge and will continue to be. What cell phone number resources have been most effective for you?

Published: October 31, 2012 by Guest Contributor

Contributed by: David Daukus As the economy recovers from the recession, consumers are becoming more responsible with their credit card usage; credit card debts have not increased and delinquency rates have declined. Delinquency rates as a percentage of balances continue to decline, with the short-term 30-59 DPD rate now at 0.9%. With mixed results, where is the profit opportunity? Further studies from Experian-Oliver Wyman state that the average bankcard balance per consumer remained relatively flat at $4,170, but the highest credit tiers (using VantageScore® credit score A and B segments) saw average balances increase to $2,422 and $3,208, respectively. It's time to focus on what you have—your current portfolio—and specifically how to:
- Increase credit card usage in the prime segments
- Assign the right lines to your cardholders
- Understand who has the ‘right’ spend
Risk score alone doesn't provide the most accurate insight into consumer accounts. You need to dig deeper into individual accounts to uncover the behavioral trends needed to grow your portfolio. Leading financial institutions are looking at consumer payment history, such as balance and utilization changes. These capture a consumer’s credit situation more accurately than a point-in-time view. When basic principles are applied to credit data, different consumer behaviors become evident and can be integrated into client strategies. For example, if two consumers have the same VantageScore® credit score, credit card balances, and payment status, does that mean they have the same current credit status? Not necessarily. By looking at their payment history, you can determine which direction each is heading. Are they increasing their debt or are they paying down their debt? These differences reveal their riskiness and credit needs.
Therefore, with payment history added to the mix, you can more accurately allocate credit lines between consumers and simultaneously reduce risk exposure. Spend is another important metric to evaluate to help grow your portfolio. How do you know if a consumer primarily uses a credit card when making purchases? Wouldn’t you want to know the right amount of credit to provide based on the consumer’s need? Insight into consumer spending levels provides a unique understanding of a consumer’s credit needs. Knowing spend allows lenders to provide the necessary high lines to the limited population of very high spenders, while reducing overall exposure by providing lower lines to low spenders. Spend data also reveals wallet share—knowing the total spend of your cardholders allows you to calculate their external spend. With wallet share data, you can capture more spend by adjusting credit lines or rewards that will entice consumers to spend more using your card. Once you have a more complete picture of a consumer, adjusting lines of credit and making the right offer is much easier. Take some of the risk out of managing your existing customers and finding new ones. What behavioral data have you found most beneficial in making lending decisions? Source: Experian-Oliver Wyman Market Intelligence Reports
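The wallet-share idea above reduces to simple subtraction. A minimal sketch, using hypothetical figures (in practice the estimate of a cardholder's total spend would come from a spend model, which is assumed here):

```python
# Hypothetical wallet-share calculation; all figures are illustrative.

def wallet_share(spend_on_your_card: float, estimated_total_spend: float) -> dict:
    """Split a cardholder's estimated total spend into on-card and external spend."""
    external_spend = estimated_total_spend - spend_on_your_card
    share = spend_on_your_card / estimated_total_spend if estimated_total_spend else 0.0
    return {"external_spend": external_spend, "wallet_share": share}

# Cardholder spends $1,200/month on our card out of an estimated $4,000 total
result = wallet_share(spend_on_your_card=1200.0, estimated_total_spend=4000.0)
print(result)
```

A wallet share of 30% here implies $2,800 of external spend each month that better lines or rewards could pull onto your card.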

Published: October 24, 2012 by Guest Contributor

I'm here in Vegas at the Money2020 conference and I am fascinated by my room key. This is not the usual “insert into the slot, wait for it to turn green or hear it chime” key card; it is a “tap and hold to the door scanner till the door opens” RFID key card. It is befitting the event I am about to attend – Money2020 – the largest of its kind, bringing together over 2,000 mobile money aficionados, strategists and technologists from all over the world for a couple of days to talk about how payment modalities are shifting and the impact of these shifts on existing and emerging players. Away from all the excitement of product launches, I hope some will be talking about one of the major barriers to consumer adoption of alternate payment modalities such as mobile – security and fraud. I was in Costa Mesa last week and, in the process of buying something for my wife with my credit card, triggered the card fraud alert. My card was declined and I had to use a different card to complete my transaction. As I was walking out, my smartphone registered a text alert from the card issuer – asking me to confirm that it was actually I who attempted the transaction, and if so, to respond by texting 1 for Yes or 2 for No. All good and proper up to this point. If someone had stolen my card or my identity, this would have been enough to stop fraud from re-occurring. In this scenario the payment instrument and the communication device were separate – my plastic credit card and my Verizon smartphone. In the next couple of years these two will converge, as my payment instrument and my smartphone become one. At that point, will the card issuer continue to send me text alerts asking for confirmation? If my phone were stolen instead of my wallet – what good would a text alert to that phone be in preventing the re-occurrence of fraud?
Further, if one card were shut down, the thief could move to other cards within the wallet – if, just as today, there are no frameworks for fraud warnings to permeate across the other cards within the wallet. Further, fraud liability is about to shift to the merchant with the 2013 EMV mandate. In recent years, there has been significant innovation in payments – to the extent that we have a number of OTT (Over the Top) players, unencumbered by regulation, who have been able to sidestep existing players – issuers and card networks – in positioning mobile as the next stage in the evolution of payments. Google, PayPal, Square, Isis (a carrier consortium formed by Verizon, T-Mobile and AT&T), and a number of others have competing solutions vying for customer mind share in this emerging space. But when it comes to security, they all revert to a 4-digit PIN – what I call the proverbial fig leaf of security. Here we have a device that offers real-time context – whether temporal, social or geo-spatial – all inherently valuable in determining customer intent and fraud, and yet we feel it is adequate to stay with the PIN, a relic as old as the payment rails these newer solutions are attempting to displace. Imagine what could have been – in the previous scenario, where instead of reaching for my card, I reach for my mobile wallet. Upon launching it, the wallet, leveraging the device context, determines that it is thousands of miles away from the customer’s home; it should then score the fraud risk and appropriately ask the customer to answer one or more “out-of-wallet” questions that must be correctly answered. If the customer fails, or prefers not to answer, the wallet can suggest alternate ways to authenticate – including IVR. Based on the likelihood of fraud, the challenge/response scenario could include questions about open trade lines or simply the color of her car.
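A wallet that scores fraud risk from device context and steps up authentication, as imagined above, could be sketched roughly as follows. The weights, thresholds and challenge ladder are invented for illustration; a real system would use a trained model, not hand-set rules:

```python
import math

# Toy context-based risk score for a mobile-wallet transaction.
# All weights, thresholds and the challenge ladder are illustrative assumptions.

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    r = 3959.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def risk_score(txn_location, home_location, amount, usual_amount):
    """Combine geo-spatial and amount context into a single toy score."""
    distance = haversine_miles(*txn_location, *home_location)
    score = 0
    if distance > 1000:
        score += 40          # far from the customer's home
    elif distance > 100:
        score += 20
    if amount > 3 * usual_amount:
        score += 30          # unusually large ticket for this customer
    return score

def challenge_for(score):
    """Escalate authentication with the likelihood of fraud."""
    if score >= 60:
        return "out-of-wallet questions"   # e.g. open trade lines
    if score >= 30:
        return "simple challenge"          # e.g. the color of her car
    return "none"

# East Coast cardholder buying in Costa Mesa, CA (illustrative coordinates)
s = risk_score((33.66, -117.92), (37.54, -77.44), amount=450.0, usual_amount=80.0)
print(s, challenge_for(s))
```

The point of the sketch is the escalation: the same device that holds the payment credential supplies the context that decides how hard the authentication challenge should be.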
Will the customer appreciate this level of pro-activeness on the issuer’s part to verify the legitimacy of the transaction? Absolutely. Merchants, who so far have been on the sidelines of the mobile payment euphoria, but for whom fraud is a real issue affecting their bottom line, will also see the value. The race to mobile payments has been all about quickly shifting spend from plastic to mobile, and incenting that by enabling smartphones to store and deliver loyalty cards and coupons. The focus needs to shift to, or to include, how smartphones can be leveraged to address and reduce fraud at the point of sale – by bringing together the context of the device and a real-time channel for multi-factor authentication. It’s relevant to talk about Google Wallet (in its revised form) and fraud in this context. Issuers have been up in arms, privately and publicly, over how Google displaces the issuer from the transaction by inserting itself in the middle and settling with the merchant prior to firing off an authorization request to the issuer on the merchant’s behalf. Issuers are worried that this could wreak havoc with their built-in fraud measures, as the authorization request will be masked by Google and could potentially result in the issuer failing to catch fraudulent transactions. Google has been assuaging issuers’ fears on this front, but has yet to offer something substantial – as it clearly does not intend to revert to where it was before: having no visibility into the payment transaction (read my post here). This is clearly shaping up to be an interesting showdown – would issuers start declining transactions where Google is the merchant of record? And how much more risk is Google willing to take to become the entity in the middle? This content is a re-post from Cherian's personal blog: http://www.droplabs.co/?p=625

Published: October 21, 2012 by Cherian Abraham

By: Kyle Aiman Let’s face it, debt collectors often get a bad rap.  Sure, some of it is deserved, but the majority of the nation’s estimated 157,000 collectors strive to do their job in a way that will satisfy both their employer and the debtor.  One way to improve collector/debtor interaction is for the collector to be trained in consumer credit and counseling. In a recent article published on Collectionsandcreditrisk.com, Trevor Carone, Vice President of Portfolio and Collection Solutions at Experian, explored the concept of using credit education to help debt collectors function more like advisors instead of accusers.  If collectors gain a better understanding of consumer credit – how to read a credit report, how items may affect a credit score, how a credit score is compiled and what factors influence the score – perhaps they can offer suggestions for improvement. Will providing past-due consumers with a plan to help improve their credit increase payments?  Read the article and let us know what you think!

Published: October 10, 2012 by Guest Contributor

By: Kyle Aiman For more than 20 years, creditors have been using scores in their lending operations. They use risk models such as the VantageScore® credit score, FICO or others to predict what kind of risk to expect before making credit-granting decisions. Risk models like these do a great job of separating the “goods” from the “bads.” Debt recovery models are built differently – their job is to predict who is likely to pay once they have already become delinquent. While recovery models have not been around as long as risk models, recent improvements in analytics are producing great results. In fact, the latest generation of recovery models can even predict who will pay the most. Hopefully, you are not using a risk model in your debt collection operations. If you are, or if you are not using a model at all, here are five reasons to start using a recovery model:
- Increase debt recovery rates – Segmenting and prioritizing your portfolios will help increase recovery rates by allowing you to place emphasis on those accounts most likely to pay.
- Manage and reduce debt recovery costs – Develop treatment strategies of varying costs and apply them appropriately. Do not waste time and money on uncollectible accounts.
- Outsource accounts to third-party collection agencies – If you use outside agencies, use recovery scoring to identify accounts best suited for assignment; take the cream off the top to keep in house.
- Send accounts to legal – Identify accounts that would be better served using a legal strategy versus spending time and money on traditional treatments.
- Price accounts appropriately for sale – If you are in a position to sell accounts, recovery scoring can help you develop a pricing strategy based on expected collectibility.
What recovery scoring tools are you using to optimize your company's debt collection efforts? Feel free to ask questions or share your thoughts below. VantageScore® is a registered trademark of VantageScore Solutions, LLC.
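The segmentation the five reasons describe could be sketched as a simple score-driven routing of delinquent accounts. The score cutoffs and tier names are hypothetical, purely to show the mechanic:

```python
# Hypothetical routing of delinquent accounts by recovery score.
# Cutoffs and treatment names are invented for illustration.

accounts = [
    {"id": 1, "balance": 1200, "recovery_score": 810},
    {"id": 2, "balance": 450,  "recovery_score": 320},
    {"id": 3, "balance": 3000, "recovery_score": 640},
    {"id": 4, "balance": 90,   "recovery_score": 150},
]

def treatment(score):
    """Map a recovery score to a treatment tier."""
    if score >= 700:
        return "in-house priority"      # keep the cream in house
    if score >= 500:
        return "outsource to agency"    # assign to a third-party agency
    if score >= 300:
        return "legal review"           # candidates for a legal strategy
    return "low-cost / sale candidate"  # price for sale, minimal effort

# Work the highest-scoring (most collectible) accounts first
for acct in sorted(accounts, key=lambda a: a["recovery_score"], reverse=True):
    print(acct["id"], treatment(acct["recovery_score"]))
```

The value is in the ordering as much as the routing: effort and cost concentrate where payment is most likely.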

Published: September 10, 2012 by Guest Contributor

By: Uzma Aziz They say “a bird in the hand is better than two in the bush” …and the same can be said about customers in a portfolio. Studies have shown time and again that the cost of acquiring a new financial services customer is many times higher than the cost of keeping an existing one. Retention has always been an integral part of portfolio management, and with the market finally on an upward trajectory, there is all the more need to hold on to profitable customers. Experts at CEB TowerGroup are forecasting a combined annual growth rate of over 12% for new credit cards alone through 2015. Combine that with a growing market of better-informed and savvy customers, and you have a very good reason to be diligent about retaining your best ones. Different-sized institutions also have varying degrees of success. According to a study by J.D. Power & Associates, overall in 2011, 9.6% of customers indicated they switched their primary bank account during the past year, up from 8.7% a year ago. Smaller banks and credit unions saw drastically lower attrition than in prior years: just 0.9% on average, down from 8.8% a year earlier. For large, mid-sized and regional banks, unfortunately, it was a different story, with attrition rates at 10 to 11.3%. It gets even more complex when you drill down to a specific type of financial product, such as a credit card. Experian’s own analysis of credit card customer retention shows that while the majority of customers are loyal, a good percentage attrite actively—that is, close their accounts and open new ones—while a bigger percentage are silent attriters: those who do not close accounts but pay down balances and move their spend to others. Obviously, attrition is a continual topic that needs to be addressed, but to minimize it you first need to understand the root cause.
Poor service seems to be the leading factor: one study* showed that 31% of consumers who switched banks did so because of poor service, followed by product features and finding a better offer elsewhere. So what are financial institutions doing to retain their profitable customers? There are lots of tools, ranging from easy to more complex, e.g., fee and interest waivers, line increases, rewards, and call center priority, to name a few. But the key to successful customer retention is to look within the portfolio, combining both internal and external information. This encompasses both proactive and reactive strategies. Proactive strategies include identifying customer behaviors that lead to balance or account attrition and taking action before a customer does. This includes monitoring changes over time and identifying thresholds for action, as well as segmentation and modeling to identify problems. Reactive strategies, as the name suggests, involve reacting when a customer has already taken an action that will lead to attrition; these include monitoring portfolios for new inquiries and account openings, or responding to customer complaints. In some cases this may be too little too late, but in others a reactive response may be what saves a customer relationship. Whichever strategy or combination of these you choose, the key points to remember to retain customers and keep them happy are:
- Understand your current customers’ perceptions about credit, as they may have changed—customers are likely to be more educated, and the most profitable ones expect only the best customer service experience
- Be approachable and personal – meet customer needs, or better yet, anticipate those needs, focusing on loyalty and customer experience
- You don’t need to “give away the farm” – sometimes a partial fee waiver works
* Global Consumer Banking Survey 2011, by Ernst & Young
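A proactive trigger of the kind described, monitoring changes over time against a threshold, can be sketched in a few lines. The silent-attriter pattern (balances paid down while spend migrates elsewhere) is from the post; the 50% threshold and balance figures are illustrative assumptions:

```python
# Hypothetical proactive attrition trigger: flag accounts whose balance has
# fallen well below its recent peak, a pattern of "silent" attrition.
# The 0.5 threshold and the sample balance histories are illustrative.

def attrition_flag(balance_history, threshold=0.5):
    """Flag if the current balance is below `threshold` of the recent peak."""
    peak = max(balance_history)
    current = balance_history[-1]
    return peak > 0 and current / peak < threshold

# Six months of balances for two cardholders
print(attrition_flag([4200, 3900, 3100, 2200, 1400, 600]))   # paying down fast
print(attrition_flag([4200, 4100, 4300, 4000, 4150, 4250]))  # stable usage
```

A flag like this is what lets the retention team act before the account is formally closed, rather than after.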

Published: August 20, 2012 by Guest Contributor

By: Stacy Schulman Earlier this week the CFPB announced a final rule addressing its role in supervising certain credit reporting agencies, including Experian and others that are large market participants in the industry. To view the original content, see Experian and the CFPB – Both Committed to Helping Consumers. During a field hearing in Detroit, CFPB Director Richard Cordray spoke about a new regulatory focus on the accuracy of the information received by the credit reporting companies, the role they play in assembling and maintaining that information, and the process available to consumers for correcting errors. We look forward to working with the CFPB on these important priorities. To read more about how Experian prioritizes these information essentials for consumers, clients and shareholders, read more on the Experian News blog. Learn more about Experian's view of the Consumer Financial Protection Bureau. ___________________ Original content provided by: Tony Hadley, Senior Vice President of Government Affairs and Public Policy About Tony: Tony Hadley is Senior Vice President of Government Affairs and Public Policy for Experian. He leads the corporation’s legislative, regulatory and policy programs relating to consumer reporting, consumer finance, direct and digital marketing, e-commerce, financial education and data protection. Hadley leads Experian’s legislative and regulatory efforts with a number of trade groups and alliances, including the American Financial Services Association, the Direct Marketing Association, the Consumer Data Industry Association, the U.S. Chamber of Commerce and the Interactive Advertising Bureau. Hadley is Chairman of the National Business Coalition on E-commerce and Privacy.

Published: July 18, 2012 by Guest Contributor

Previously, we looked at the various ways a dual score strategy could help you focus in on an appropriate lending population. Find your mail-to population with a prospecting score on top of a risk score; locate the riskiest of all consumers by layering a bankruptcy score with your risk model. But other than multiple scores, what other tools can be used to improve credit scoring effectiveness? Credit attributes add additional layers of insight from a risk perspective. Not everyone who scores an 850 represents the same level of risk once you start interrogating their broader profile. How much total debt are they carrying? What is the nature of it – is it mortgage or mostly revolving? A credit score may not fully articulate a consumer as high risk, but if their debt obligations are high, they may represent a very different type of risk than another consumer with the same 850 score. Think of attribute overlays as tuning the final score valuation of an individual consumer by making the credit profile more transparent, allowing a lender to see more than just the risk odds associated with the initial score. Attributes can also help you refine offers. A consumer may be right for you in terms of risk, but are you right for them? If they have 4 credit cards with $20K limits each, they’re likely going to toss your $5K card offer in the trash. Attributes can tell us these things, and more. For example, while a risk score can tell us what the risk of a consumer is within a set window, certain credit attributes can tell us something about the stability of that consumer to remain within that risk band. Recent trends in score migration – the change in a consumer's level of creditworthiness subsequent to generation of a current credit score – can undermine the most conservative of risk management policies.
At the height of the recession, VantageScore® Solutions LLC studied the migration of scores across all risk bands and was able to identify certain financial management behaviors found within consumers' credit files. These behaviors (signaling, credit footprint, and utility) assess the consumer’s likelihood of improving, significantly deteriorating, or maintaining a stable score over the next 12 months. Knowing which subgroup of your low-risk population is deteriorating, or which high-risk groups are improving, can help you make better decisions today.
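An attribute-overlay decision of the kind described might look like the following sketch. The score cutoffs, attribute names and decision labels are hypothetical, not an actual scoring policy:

```python
# Hypothetical overlay of credit attributes and migration trend on a risk score.
# Cutoffs, attribute names and decision labels are illustrative assumptions.

def decision(risk_score: int, revolving_utilization: float,
             total_debt: float, migration_flag: str) -> str:
    """Refine a plain score cutoff with attribute and score-migration overlays."""
    if risk_score < 700:
        return "decline"
    # Two consumers can share an 850 score yet carry very different profiles:
    # overlay attributes to separate them before assigning a line.
    if revolving_utilization > 0.80 or total_debt > 250_000:
        return "approve-low-line"    # approve, but limit exposure
    if migration_flag == "deteriorating":
        return "approve-monitor"     # stable score today, trending downward
    return "approve-standard"

# Same 850 score, different attribute profiles, different outcomes
print(decision(850, revolving_utilization=0.15, total_debt=40_000, migration_flag="stable"))
print(decision(850, revolving_utilization=0.92, total_debt=40_000, migration_flag="stable"))
```

The overlay never overrides the score's pass/fail role; it only tunes the treatment within the passing band, which is the point the post makes about transparency.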

Published: June 12, 2012 by Veronica Herrera

One of the most successful best practices for improving agency performance is the use of scorecards for assessing and rank-ordering the performance of agencies in competition with each other. Much like people, agencies thrive when they understand how they are evaluated, how to influence the factors that contribute to success, and the recognition and reward for top-tier performance. Rather than a simple view of performance based upon a recovery rate as a percentage of total inventory, best practice suggests that performance is more accurately reflected in vintage batch liquidation and peer group comparisons to the liquidation curve. Why? In a nutshell: differences in inventory aging and the liquidation curve. Let’s explain this in greater detail. Historically, collection agencies would provide their clients with better performance reporting than their clients had available to them. Clients would know how much business was placed in aggregate, but not by the specific vintage relating to the month or year of placement. Thus, when a monthly remittance was received, the client would be incapable of understanding whether this month’s recoveries were from accounts placed last month, this year, or three years ago. This made forecasting of future cash flows from recoveries difficult, in that you would have no insight into where the funds were coming from. We know that as a charged-off debt ages, its future liquidation rate is generally downward sloping (the exception is auto finance debt, as there is a delay between the time of charge-off and rehabilitation of the debtor, thus future flows are higher beyond the 12-24 month timeframe). How would you predict future cash flows and liquidation rates without understanding the different vintages in the overall charged-off population available for recovery? This lack of visibility into liquidation performance created another issue.
How do you compare the performance of two different agencies without understanding the age of the inventory and how it is liquidating? As an example, let’s assume that Agency A has been handling your recovery placements for a few years, and has an inventory of $10,000,000 that spans 3+ years, of which $1,500,000 has been placed this year. We know from experience that placements from 3 years ago experienced their highest liquidation rate earlier in their lifecycle, and the remaining inventory from those early vintages is uncollectible or almost fully liquidated. Agency A remits $130,000 this month, for a recovery rate of 1.3%. Agency B is a new agency just signed on this year, and has an inventory of $2,000,000 assigned to them. Agency B remits $150,000 this month, for a recovery rate of 7.5%. So, you might assume that Agency B outperformed Agency A by a whopping 6.2%. Right? Er … no. Here’s why. If we had better visibility of Agency A’s inventory, and of where their remittance of $130,000 was derived, we would have known that only a couple of small, insignificant payments came from the older vintages of the $10,000,000 inventory, and that of the $130,000 remitted, over $120,000 came from current-year inventory (the $1,500,000 in current-year placements). Thus, when analyzed on a vintage batch liquidation basis, Agency A collected $120,000 against inventory placed in the current year, for a liquidation rate of 8.0%. The remaining remittance of $10,000 was derived from prior years’ inventory. So, when we compare Agency A, with current-year placements inventory of $1,500,000 and a recovery rate against those placements of 8.0% ($120,000), versus Agency B, with current-year placements inventory of $2,000,000 and a recovery rate of 7.5% ($150,000), it’s clear that Agency A outperformed Agency B. This is why the vintage batch liquidation model is the clear-cut best practice for analysis and MI.
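The Agency A versus Agency B comparison above can be reproduced numerically; a minimal sketch using the figures from the example:

```python
# Recreating the Agency A vs. Agency B example with a vintage batch view.

def recovery_rate(remitted: float, inventory: float) -> float:
    """Remittance as a fraction of the inventory it is measured against."""
    return remitted / inventory

# Naive view: total monthly remittance over total inventory, regardless of age
agency_a_naive = recovery_rate(130_000, 10_000_000)        # 1.3%
agency_b_naive = recovery_rate(150_000, 2_000_000)         # 7.5%

# Vintage view: attribute remittance to the placement batch it came from
agency_a_current_year = recovery_rate(120_000, 1_500_000)  # 8.0% on current-year paper
agency_b_current_year = recovery_rate(150_000, 2_000_000)  # 7.5% on current-year paper

print(f"Naive:   A {agency_a_naive:.1%} vs B {agency_b_naive:.1%}")
print(f"Vintage: A {agency_a_current_year:.1%} vs B {agency_b_current_year:.1%}")
```

The ranking flips once remittance is attributed to the vintage it came from, which is exactly why the naive aggregate rate misleads.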
By using a vintage batch liquidation model and analyzing performance against monthly batches, you can begin to interpret and define the liquidation curve. A liquidation curve plots monthly liquidation rates against a specific vintage, usually by month. [Exhibit 1: Liquidation Curve Analysis] Note that in Exhibit 1, the monthly liquidation rate as a percentage of the total vintage batch inventory appears on the y-axis, and the month of funds received appears on the x-axis. Thus, for each of the three vintage batches, we can track the monthly liquidation rates for each batch from its initial placement throughout the recovery lifecycle. Future monthly cash flow for each discrete vintage can be forecasted based upon past performance, and then aggregated to create a future recovery projection. The most sophisticated and up-to-date collections technology platforms, including Experian’s Tallyman™ and Tallyman Agency Management™ solutions, provide vintage batch or laddered reporting. These reports can then be used to create scorecards for comparing and weighing the performance results of competing agencies for market share competition and performance management. Scorecards As we develop an understanding of liquidation rates using the vintage batch liquidation curve example, we see an obvious opportunity to reward performance based upon targeted liquidation performance in time series from the initial placement batch. Agencies have different strategies for managing client placements and balancing clients’ liquidation goals with agency profitability. The more aggressive the collections process aimed at creating cash flow, the greater the costs. Agencies understand the concept of unit yield and profitability; they seek to maximize the collection result at the lowest possible cost to create profitability.
Thus, agencies will “job slope” clients’ projects, scaling back effort where the collectability of the placement is lower (driven by balance size, customer credit score, date of last payment, phone number availability, type of receivable, etc.). For utility companies and other credit grantors with smaller-balance receivables, this presents a greater problem, as smaller balances create smaller unit yields. Job sloping involves reducing the frequency of collection efforts, employing lower-cost collectors to perform some of the collection efforts, and, where applicable, engaging offshore resources at lower cost to perform collection efforts. You can often see the impact of various collection strategies by comparing agency performance in monthly intervals from batch placement. Again, using a vintage batch placement analysis, we track the performance of monthly batch placements assigned to competing agencies. We compare the liquidation results on these specific batches in monthly intervals, up until the receivables are recalled. Typical patterns emerge from this analysis that inform you of the differences in collection strategy. Let’s look at an example of differences across agencies and how these strategy differences can have an impact on liquidation. As we examine the results across both the first and second 30-day phases, we are likely to find that Agency Y performed the highest of the three agencies, with the highest collection costs and the corresponding impact on profitability. Their collection effort was the most uniform over the two 30-day segments, using the dialer at 3-day intervals in the first 30-day segment, and then using a balance segmentation scheme to differentiate treatment at 2-day or 4-day intervals throughout the second 30-day phase. Their liquidation results would be the strongest, in that liquidation rates would be sustained into the second 30-day interval.
Agency X would likely come in third place in the first 30-day phase, due to a 14-day delay strategy followed by two outbound dialer calls at 5-day intervals. They would have better performance in the second 30-day phase due to the tighter 4-day intervals for dialing, likely moving into second place in that phase, albeit at higher collection costs for them. Agency Z would come out of the gates in the first 30-day phase in first place, due to an aggressive daily dialing strategy, and their takeoff and early liquidation rate would seem to suggest top-tier performance. However, in the second 30-day phase, their liquidation rate would fall off significantly due to the use of a less expensive IVR strategy, negating the gains from the first phase, and potentially reducing their overall position over the two 30-day segments versus their peers. The point is that with a vintage batch liquidation analysis, we can isolate the performance of a specific placement across multiple phases / months of collection efforts, without having that performance insight obscured by new business blended into the analysis. Had we used the more traditional current-month remittance over inventory value, Agency Z might be put in a more favorable light, as each month they collect new paper aggressively and generate strong liquidation results competitively, but then virtually stop collecting against non-responders, thus “creaming” the paper in the first phase and leaving a lot on the table. That said, how do we ensure that an Agency Z is not rewarded with market share? Using the vintage batch liquidation analysis, we develop a scorecard that weights the placement across the entire placement batch lifecycle, and summarizes points in each 30-day phase. To read Jeff's related posts on the topic of agency management, check out: Vendor auditing best practices that will help your organization succeed Agency management, vendor scorecards, auditing and quality monitoring
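A phase-weighted scorecard of the kind just described, one that keeps a fast-start agency like Agency Z from being over-rewarded, might be sketched like this. The per-phase liquidation rates and the equal weights are invented numbers chosen to mirror the narrative above:

```python
# Illustrative phase-weighted scorecard for competing agencies.
# Liquidation rates per 30-day phase and the weights are invented figures.

PHASE_WEIGHTS = [0.5, 0.5]   # equal weight to each 30-day phase

def scorecard_points(phase_liquidation_rates):
    """Weight each phase's liquidation rate into a single comparable score."""
    return sum(w * r for w, r in zip(PHASE_WEIGHTS, phase_liquidation_rates))

agencies = {
    "Agency X": [0.040, 0.055],  # slow start, stronger second phase
    "Agency Y": [0.060, 0.058],  # uniform effort, sustained liquidation
    "Agency Z": [0.070, 0.020],  # aggressive start, falls off sharply
}

# Rank agencies by lifecycle-weighted score rather than first-phase takeoff
ranked = sorted(agencies.items(), key=lambda kv: scorecard_points(kv[1]), reverse=True)
for name, rates in ranked:
    print(name, f"{scorecard_points(rates):.3f}")
```

Weighted across both phases, Agency Y's sustained liquidation beats Agency Z's early spike, which is the behavior the scorecard is meant to reward.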

Published: April 25, 2012 by Guest Contributor
