TL;DR: Read on for how Touch ID is made possible by ARM's TrustZone/TEE, and why this matters in the context of Apple's coming identity framework. I also explain why primary/co-processor combos are here to stay. I believe that eventually Touch ID has a payments angle – but one focused on e-commerce before retail. Carriers will weep over a lost opportunity, while through Touch ID we have front-row seats to Apple's enterprise strategy, its payments strategy and, above all, the future direction of its computing platform.

I shared my take on a possible Apple biometric solution in January of this year, based on its AuthenTec acquisition. I came pretty close, except for the suggestion that NFC was likely to be included. (Sigh.) Here is what I wrote then: it's a bit early to play fast and loose with Apple predictions, but its AuthenTec acquisition should rear its head sometime in the near future (2013 – considering Apple's manufacturing lead times). A biometric solution packaged neatly with an NFC chip and secure element could address three factors that have held back customer adoption of biometrics: ubiquity of readers, issues around secure local storage and retrieval of biometric data, and standardization in accessing and communicating said data. An on-chip secure solution that stores biometric data in the phone's secure element can address qualms around a central database of biometric data open to all sorts of malicious attacks. Standard methods to store and retrieve credentials stored in the SE will apply here as well.

Why didn't Apple open up Touch ID to third-party developers? Apple expects a short, bumpy climb ahead for Touch ID before it stabilizes as early users begin to use it. By keeping its use limited to authenticating to the device and to iTunes, it can tightly control potential issues as they arise. If Touch ID had launched with third-party apps and been buggy, customers would likely have been confused about where to report issues and whom to blame. That's not to say Apple won't open up Touch ID outside of Apple. I believe it will provide fettered access based on the type of app and the type of action that follows user authentication. Banking, payment, productivity, social sharing and shopping apps should come first. Your fart apps? Probably never. Apple could also allow users to set their preferences (by app category, based on the user's current location, etc.) so that biometrics is how one authenticates for transactions that carry risk, while routine ones skip it. If you are at home buying an app for a buck, don't ask the user to authenticate. If you are initiating a money transfer, do. Even better, pair biometrics with your PIN for stronger security. Chip and PIN? So passé. (A toy sketch of such a step-up policy follows at the end of this section.)

Digital signatures, iPads and DRM 2.0: It won't be long before an iPad shows up in the wild sporting Touch ID. And with BlackBerry's much-awaited and celebrated demise in the enterprise, Apple will be waiting on the sidelines – now with capabilities that allow digital signatures to become ubiquitous and simple on email, contracts or anything else worth putting a signature on. Apple has already made its iWork productivity apps (Pages, Numbers, Keynote), iMovie and iPhoto free for new iOS devices activated with iOS 7. Apple, with a core fan base that includes photographers, designers and other creative types, can now further enable iPads and iPhones to become content creation devices, with the ability to attribute any digital content back to its creator by a set of biometric keys.
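Here is that toy sketch of a risk-based step-up policy. Every threshold, category and trusted-location rule below is my own illustrative assumption, not anything Apple has announced:

```python
# Illustrative risk-based step-up authentication policy (all values invented).
# Decides what authentication a transaction should demand.

TRUSTED_LOCATIONS = {"home", "work"}            # assumed user-configured
LOW_RISK_CATEGORIES = {"app_purchase", "media"}

def required_auth(amount_usd: float, category: str, location: str) -> str:
    """Return the authentication level to demand for this transaction."""
    if category == "money_transfer":
        return "biometric+pin"                  # highest risk: pair the factors
    if (location in TRUSTED_LOCATIONS
            and category in LOW_RISK_CATEGORIES
            and amount_usd < 5.00):
        return "none"                           # a $1 app at home: don't prompt
    if amount_usd < 100.00:
        return "biometric"
    return "biometric+pin"

print(required_auth(0.99, "app_purchase", "home"))      # -> none
print(required_auth(500.00, "money_transfer", "home"))  # -> biometric+pin
```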
Imagine a new way to digitally create and sign content, to share freely, without worrying about attribution. Further, Apple's existing DRM frameworks are strengthened by the ability to tag digital content that you download with your own set of biometric keys. Forget disallowing the sharing of content – Apple now has a way to create a secondary marketplace for its customers to resell or loan digital content, and to drive incremental revenue for itself and content owners.

Conclaves blowing smoke: In a day and age when we forgo the device for storing credentials – whether due to convenience or ease of implementation – Apple opted for an on-device answer to where user biometric keys are stored. There is a reason it did so, beyond the obvious brouhaha that would have resulted had it chosen to store these keys in the cloud. Keys inside the device; signed content on the cloud. Best of both worlds (a sketch of the signing idea follows below). Biometric keys need to be held locally, so that authentication requires no round trip and therefore imposes no latency. Apple would have chosen local storage (ARM's SecurCore) as a matter of customer experience, considering what would happen if the customer were out of coverage with no internet access. There is also the obvious point that a centralized biometric keystore would be in the crosshairs of every malicious entity. By decentralizing it, Apple made it infinitely more difficult to scale an attack or a potential vulnerability.

More than the A7, the Trojan horse in Apple's announcement was the M7 chip – referred to as the motion co-processor. I believe the M7 does more than just measure motion data.

M7 – a security co-processor? I am positing that Apple is using ARM's TrustZone foundation, and that it may be using the A7 or the new M7 co-processor for storing these keys and handling the secure backend processing required. Horace Dediu of Asymco called into question why Apple had opted for the M7 and suggested that it may have a yet-unstated use. I believe the M7 is not just a motion co-processor; it is also a security co-processor. I am guessing the M7 is based on the Cortex-M series of processors, offloads much of this secure backend logic from the primary A7 processor, and likely stores the keys themselves. The Cortex-M4 has capabilities that sound very similar to what Apple announced around the M7 – a very low-power chip built to integrate sensor output and wake up only when something interesting happens. We should know soon.

This type of combo – splitting functions across different cores – allows each core to focus on the function it is supposed to perform. I suspect Android will not be far behind in its adoption, with each core focusing on one or more specific layers of the Android software stack. Back at Google I/O 2013, Google announced three new location APIs (including the fused location provider) that enable location tracking without the traditional heavy battery consumption. It looks to me as if Android decoupled these functions so that we will soon see processor cores that focus on them specifically.

I am fairly confident that Apple has opted for ARM's TrustZone/TEE. Implementation details of TrustZone are proprietary and therefore not public. Apple could have made revisions to the A7 chip spec and rolled its own. But using TrustZone/TEE and SecurCore allows Apple to adopt existing standards around accessing and communicating biometric data.
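To ground the "keys inside the device, signed content on the cloud" idea, here is a minimal sketch of device-side signing using Python's cryptography library. It is a stand-in for whatever Apple actually does: in a real TEE/SecurCore design, key generation and signing would happen in the secure world, and none of this reflects a published Apple API:

```python
# Conceptual sketch: a device-resident key signs content before it is shared.
# In a real TEE/secure-element design the private key never leaves secure
# hardware; here it simply never leaves this process.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Generated once on-device (stand-in for a key provisioned inside the TEE).
device_key = ec.generate_private_key(ec.SECP256R1())

def sign_content(content: bytes) -> bytes:
    """Sign locally; only the content and signature ever go to the cloud."""
    return device_key.sign(content, ec.ECDSA(hashes.SHA256()))

def verify(content: bytes, signature: bytes) -> bool:
    """Anyone holding the public key can verify attribution."""
    try:
        device_key.public_key().verify(signature, content,
                                       ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

doc = b"Draft contract, signed on-device"
print(verify(doc, sign_content(doc)))  # -> True
```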
Apple is fully aware of the need to mature iOS into a trusted enterprise computing platform – addressing the lack of low-end x86 devices with hardware security platform tech. This is a significant step towards that future.

What does Touch ID mean for payments? Apple's plans for Touch ID kick off with iTunes purchase authorizations. Beyond that, as iTunes continues to grow into a media store behemoth, Touch ID has the potential to drive fraud risk down for Apple – and to further allow it to drive down risk as it batches up payment transactions to reduce interchange exposure. It's quite likely that, à la Walmart, Apple has negotiated rate reductions – but now it can assume more risk on the front end because it is able to vouch for the authenticity of these transactions. As they say, customers can no longer plead the Fifth on those late-night weekend drunken purchase binges. Along with payment aggregation, or via iTunes gift cards, Apple now has another mechanism to reduce its interchange and risk exposure.

Now imagine if Apple were to extend this capability beyond iTunes purchases and allow app developers to process in-app purchases of physical goods or real-world experiences through iTunes in return for better blended rates (instead of PayPal's 4% + $0.30). Heck, Apple can opt for short-term lending if it is able to effectively answer the question of identity – as it can with Touch ID. It's PayPal's 'Bill Me Later' on steroids. Effectively, for a company like Apple – one that has seriously toyed with the idea of a software SIM and a "real-time wireless provider marketplace" where carriers bid against each other to provide you voice, messaging and data access for the day, with your phone picking the most optimal carrier – how far is that notion from picking the cheapest rate across networks for funneling your payment transactions? Based on the level of authentication provided, or on other known attributes such as merchant type, location, fraud risk and customer payment history, iTunes can select across a variety of payment options to pick the one that is optimal for the app developer and for itself (a toy sketch of such routing follows below).

And finally, who had the most to lose with Apple's Touch ID? Carriers. I wrote about this before as well; here's what I wrote then (edited for brevity): Does it mean that carriers have no meaningful role to play in commerce? Au contraire. They do. But it's around fraud and authentication. It's around identity. … But they seem to be stuck imitating Google in figuring out a play at the front end of the purchase funnel, to become a consumer brand (Isis). The last thing they want to do is leave it to Apple to figure out the "identity management" question, which the latter seems best equipped to answer by way of scale, the control it exerts in the ecosystem, its vertical integration strategy that allows it to fold biometrics meaningfully into its lineup, and its ability to start with its own services to offer customer value. So there had to have been much 'weeping and moaning and gnashing of teeth' on the carrier front with this launch. Carriers have been so focused on carving out a place in payments that they lost track of what's important – that once you have solved authentication, payments is nothing but accounting. I didn't say that; Ross Anderson did, at the Kansas City Fed.

What about NFC? I don't have a bloody clue. Maybe iPhone 6? This is a re-post from Cherian's original blog post "Smoke is rising from Apple's Conclave".
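Postscript: the rail-picking idea above, as a toy sketch – pick the cheapest eligible rail per transaction. The rails, fee schedules and risk tiers are all invented for illustration:

```python
# Hypothetical least-cost routing across payment options.
# Fee schedules and risk tiers are illustrative, not actual network pricing.
RAILS = {
    "ach":    {"fixed": 0.10, "pct": 0.000,  "max_risk": "low"},
    "debit":  {"fixed": 0.21, "pct": 0.0005, "max_risk": "medium"},
    "credit": {"fixed": 0.30, "pct": 0.029,  "max_risk": "high"},
}
RISK_ORDER = {"low": 0, "medium": 1, "high": 2}

def cheapest_rail(amount: float, txn_risk: str) -> str:
    """Pick the lowest-cost rail whose risk tolerance covers the transaction."""
    eligible = [name for name, r in RAILS.items()
                if RISK_ORDER[r["max_risk"]] >= RISK_ORDER[txn_risk]]
    return min(eligible,
               key=lambda n: RAILS[n]["fixed"] + RAILS[n]["pct"] * amount)

print(cheapest_rail(1.29, "low"))    # small, well-authenticated buy -> ach
print(cheapest_rail(450.0, "high"))  # risky transaction -> credit
```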
By: Matt Sifferlen

I recently read interesting articles on the Knowledge@Wharton and CNNMoney sites covering the land grab taking place among financial services startups that are trying to use a consumer's social media activity and data to make lending decisions. Each of these companies is looking at ways to take the mountains of social media data that sites such as Twitter, Facebook, and LinkedIn generate in order to create new and improved algorithms that will help lenders target potentially creditworthy individuals. What are they looking at specifically? Some criteria could be:

- History of typing in ALL CAPS or all lower case letters
- Frequent usage of inappropriate comments
- Number of senior level connections on LinkedIn
- The quantity of posts containing cats or annoying self-portraits (aka "selfies")

Okay, I made that last one up. The point is that these companies are scouring the data individuals create on social sites and trying to find useful ways to slice and dice it in order to evaluate and target consumers better (a toy sketch of this kind of signal extraction follows below). On the consumer banking side of the house, there are benefits in tracking down individuals for marketing and collections purposes. A simple search could yield a person's Facebook, Twitter, or LinkedIn profile. The behavioral information can then be leveraged as part of more targeted multi-channel contact strategies. On the commercial banking side, utilizing social site info can help supplement traditional underwriting practices. Reviewing the history of a company's reviews on Yelp or Angie's List could offer insight into how a business is perceived and reveal whether there is any meaningful trend in the level of negative feedback being posted, or in the potential growth outlook of the company.

There are some challenges involved with leveraging social media data for these purposes:

1. Easily manipulated information
2. Irrelevant information that doesn't represent actual likes, thoughts or relevant behaviors
3. Regulations

From a fraud perspective, most online information can easily and frequently be manipulated, which creates a constantly moving target for these providers to monitor and link to the right customer. Fake Facebook and Twitter pages, false connections and referrals on LinkedIn, and fabricated positive online reviews of a business can all be accomplished in a matter of minutes. And commercial fraudsters are likely creating false business social media accounts today for shelf company fraud schemes that they plan on hatching months or years down the road. As B2B review websites continue to make it easier to get customers signed up to use their services, the downside is that even more unusable information will be created, since there are fewer and fewer hurdles for commercial fraudsters to clear, particularly on sites that offer their services for free. For now, the larger lenders are more likely to utilize alternative data sources that are third-party validated, like rent and utility payment histories, while continuing to rely on tools that can protect against fraud schemes. It will be interesting to see what new credit and non-credit data will be utilized as common practice in the future as lenders continue their efforts to find more useful data to power their credit and marketing decisions.
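As promised above, a toy sketch of that kind of signal extraction. Computing the features is the trivial part; whether they predict creditworthiness is exactly what remains unproven:

```python
# Toy extraction of two of the signals listed above (illustrative only).
def caps_ratio(posts: list[str]) -> float:
    """Share of posts typed entirely in upper case."""
    return sum(p.isupper() for p in posts) / len(posts) if posts else 0.0

def senior_connections(titles: list[str]) -> int:
    """Count connections whose job titles look senior-level."""
    senior = ("chief", "vp", "director", "head of")
    return sum(any(s in t.lower() for s in senior) for t in titles)

print(caps_ratio(["GREAT DEAL!!!", "had coffee", "CALL ME NOW"]))  # -> 0.666...
print(senior_connections(["VP Sales", "Analyst", "Chief Risk Officer"]))  # -> 2
```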
By: Joel Pruis

As we go through the economic seasons, we need to remember to reassess our strategy. While we use data to accurately assess the environment and determine the best course of action for our future strategy, the one thing that is certain is that the current environment will change. Aspects we did not anticipate will develop; trends may slow or change direction. Moneyball continues to be a movie that gives us some great examples. We see that Billy Beane and Peter Brand were constantly looking at their position and making adjustments to the team's roster. Even before they made any significant adjustments, Beane and Brand found themselves justifying their strategy to the owner (even though the primary issue was the head coach not playing the roster that maximized the team's probability of winning). The first aspect that worked against the strategy was the head coach, and while we could go down a tangent about cultural battles within an organization, let's focus on how Beane adjusted. Beane simply traded the players the head coach preferred to play, forcing the use of players preferred by Beane and Brand. Later we see Beane and Brand making final adjustments to the roster by negotiating trades, resulting in the Oakland A's landing Ricardo Rincon. The change in the league that allowed such a trade was that Rincon's team was not doing well, and the timing allowed the A's to execute the trade. Beane adjusted with the changes in the league. One thing to note is that he changed the roster while the team was doing well. They were winning, but Beane made adjustments to continue maximizing the team's potential. Too often we adjust when things are going poorly and fail to adjust when we seem to be hitting our targets. Overall, we need to continually assess what has changed in our environment and determine what new challenges or new opportunities these changes present. I encourage you to regularly assess what is happening in your local economy. High-level national trends are constantly on the front page of the news, but we need to drill down to see what is happening in the specific market area being served. As Billy Beane did with the Oakland A's throughout the season, I challenge you to assess your current strategies and execution against what is happening in your market territory.

Related posts:
- How Financial Institutions can assess the overall conditions for generating the net yield on the assets
- How to create decision strategies for small business lending
- Upcoming Webinar: Learn about the current state of small business, the economy and how it applies to you
If you're looking to implement and deploy a knowledge-based authentication (KBA) solution in the application process for your online and mobile customer acquisition channels, then I have good news for you! Here's some of the upside you'll see right away:

- Revenues (remember, the primary activity of your business?) will accelerate
- Your B2C acceptance or approval rates will go up through automation
- Manual review of customer applications will go down, which translates to a reduction in your business operation costs
- Products will be sold and shipped faster if you're in the retail business, so you can recognize the sales revenue or net sales quicker
- Your customers will appreciate that they can do business in minutes, versus going through a lengthy application approval process with turnaround times of days to weeks
- And last but not least, your losses due to fraud will go down

To keep you informed about what's relevant when choosing a KBA vendor, here's what separates the good KBA providers from the bad:

- The underlying data used to create questions should come from multiple data sources and should vary in type, for example credit and non-credit; relying on public record data sources alone is becoming a risky proposition given the recent adoption of various social media and public record websites
- Technology that allows you to create a custom KBA setup unique to your business and business customers, and a proven support structure to help you grow your business safely
- Consulting (performance monitoring) and analytical support that keeps you ahead of the fraudsters trying to game your online environment by assuring your KBA tool is performing at optimal levels
- Solutions that can easily interface with multiple systems and assist from a customer experience perspective

How are your peers in the following three industries doing at adopting a KBA strategy to help grow and protect their businesses?

E-commerce
- 21% use KBA today and are satisfied with the results*
- 13% have KBA on the roadmap, and the list is growing fast*

Healthcare
- 20% use dynamic KBA*

Financial institutions
- 30% use a combination of dynamic & static KBA*
- 20% use dynamic KBA*

What are the typical uses of KBA?*
- Call center
- Web / mobile verification
- Enrollment ID verification
- Provider authentication
- Eligibility

*According to a 2012 report on knowledge-based authentication by Aite Group LLC

Knowledge-based authentication, commonly referred to as KBA, is a method of authentication which seeks to prove the identity of someone accessing a service, such as a website. As the name suggests, KBA requires knowledge of the individual's personal information to grant access to the protected material. There are two types of KBA: "static KBA", which is based on a pre-agreed set of "shared secrets", and "dynamic KBA", which is based on questions generated from a wider base of personal information (a toy sketch of a dynamic flow follows below).
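To make the static/dynamic distinction concrete, here is a minimal sketch of a dynamic KBA flow: generate out-of-wallet questions from file data with plausible decoys, then grade the answers. The record fields, decoys and pass mark are all invented:

```python
# Minimal sketch of dynamic ("out-of-wallet") KBA. All data is invented.
import random

def make_question(field: str, true_value: str, decoys: list[str]) -> dict:
    options = decoys + [true_value]
    random.shuffle(options)
    return {"prompt": f"Which of these matches your {field}?",
            "options": options, "answer": true_value}

record = {"previous street": "Oak Ave", "2009 auto lender": "First Finance"}
decoys = {"previous street": ["Elm St", "Main St", "None of the above"],
          "2009 auto lender": ["AutoBank", "CarCredit", "None of the above"]}

quiz = [make_question(f, v, decoys[f]) for f, v in record.items()]

def grade(quiz: list[dict], responses: list[str], pass_mark: int = 2) -> bool:
    """Applicant passes if enough answers match the file data."""
    return sum(r == q["answer"] for q, r in zip(quiz, responses)) >= pass_mark

print(grade(quiz, ["Oak Ave", "First Finance"]))  # -> True
```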
There are two core fundamentals of evaluating loan loss performance to consider when generating organic portfolio growth through the setting of customer lending limits. Neither can be discussed without first considering what defines a "customer."

Definition of a customer: The approach used to define a customer is critical for successful customer management and is directly correlated to how joint accounts are managed. Definitions may vary by how joint accounts are allocated and used in risk evaluation. It is important to acknowledge:

- Legal restrictions for data usage related to joint account holders throughout the relationship
- Impact on predictive model performance and reporting where there are two financially linked individuals with differently assigned exposures
- Complexities of multiple relationships with customers within the same household – consumer and small business

Typical customer definitions used by financial services organizations:

- Checking account holders: This definition groups together accounts that are "fed" by the same checking account. If an individual holds two checking accounts, then she will be treated as two different and unique customers.
- Physical persons: Joint accounts are allocated to each individual. If Mr. Jones has sole accounts and holds joint accounts with Ms. Smith, who also has sole accounts, the joint accounts would be allocated to both Mr. Jones and Ms. Smith.
- Consistent entities: If Mr. Jones has sole accounts and holds joint accounts with Ms. Smith, who also has sole accounts, then three "customers" are defined: Jones, Jones & Smith, and Smith.
- Financially-linked individuals: Whereas consistent entities are considered three separate customers, financially-linked individuals would be considered one customer: "Mr. Jones & Ms. Smith".

When multiple and complex relationships exist, taking a pragmatic approach that defines your customers as financially linked will lead to a better evaluation of predicted loan performance (see the grouping sketch below).

Evaluation of credit and default risk: Most financial institutions calculate a loan default probability on a periodic (monthly) basis for existing loans, in the format of either a custom behavior score or a generic risk score supplied by a credit bureau. For new loan requests, financial institutions often calculate an application risk score, sometimes used in conjunction with a credit bureau score, often in a matrix-based decision. This approach is challenging for new credit requests where the presence and nature of the existing relationship is not factored into the decision. In most cases, customers with existing relationships are treated in an identical manner to new applicants with no relationship – the power and value of the organization's internal data goes overlooked, and customer satisfaction and profits suffer as a result. One way to overcome this challenge is to use a Strength of Relationship (SOR) indicator.

Strength of Relationship (SOR) indicator: The Strength of Relationship (SOR) indicator is a single-digit value used to define the nature of the customer's relationship with the financial institution.
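Before moving on to the SOR: grouping "financially-linked individuals" is a connected-components problem – any two people who share a joint account belong to the same customer. A minimal sketch (all names invented):

```python
# Group account holders into "financially-linked" customers via shared joint
# accounts -- a connected-components (union-find) pass over holder pairs.
from collections import defaultdict

def linked_customers(joint_accounts: list[tuple[str, str]]) -> list[set[str]]:
    parent: dict[str, str] = {}

    def find(x: str) -> str:
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    for a, b in joint_accounts:
        parent[find(a)] = find(b)           # union the two holders

    groups = defaultdict(set)
    for person in list(parent):
        groups[find(person)].add(person)
    return list(groups.values())

print(linked_customers([("Jones", "Smith"), ("Smith", "Lee"), ("Wu", "Park")]))
# -> [{'Jones', 'Smith', 'Lee'}, {'Wu', 'Park'}]
```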
Traditional approaches for the assignment of an SOR are based upon the following factors:

- Existence of a primary banking relationship (salary deposits)
- Number of transactional products held (DDA, credit cards)
- Volume of transactions
- Number of loan products held
- Length of time with the bank

The SOR plays a critical role in the calculation of customer-level risk grades and strategies, and is used to point us to the data that will be the most predictive for each customer. Typically, the stronger the relationship, the more we know about our customer, and the more robust our predictive models of consumer behavior will be. The more information we have on our customer, the more our models will lean towards internal data as the primary source. For weaker relationships, internal data alone may not be robust enough to calculate customer-level limits, and there will be a greater dependency on augmenting internal data with external third-party data (credit bureau attributes). As such, the SOR can be used as a tool to select the type and frequency of external data purchases.

Customer Risk Grade (CRG): A customer-level risk grade or behavior score is a periodic (monthly) statistical assessment of the default risk of an existing customer. This probability rests on the assumption that past performance is the best possible indicator of future performance. The predictive model is calibrated to provide the probability (or odds) that an individual will incur a "default" on one or more of their accounts. The customer risk grade requires a common definition of a customer across the enterprise. This is required to establish a methodology for treating joint accounts. A unique customer reference number is assigned to those customers defined as "financially-linked individuals". Account behavior is aggregated on a monthly basis, and this information is subsequently combined with information from savings accounts and third-party sources to formulate our customer view. Using historical customer information, the behavior score can accurately differentiate between good and bad credit risk individuals. The behavior score is often translated into a Customer Risk Grade (CRG). The purpose of the CRG is to simplify the behavior score for operational purposes, making it easier for non-credit-risk individuals to interpret a grade than a mathematical probability (a toy banding example follows below). Different methods for evaluating credit risk will yield different results, and an important aspect of setting customer exposure thresholds is the ability to perform analytical tests of different strategies in a controlled environment. In my next post, I'll dive deeper into adaptive control, champion/challenger techniques and strategy design fundamentals.

Related content: White paper: Improving decisions across the Customer Life Cycle
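The behavior-score-to-CRG translation is typically just a banding exercise. A toy illustration – the band edges below are mine, not a recommended policy:

```python
# Map a behavior score (12-month probability of default) to a Customer Risk
# Grade. Band edges are illustrative assumptions, not a policy.
BANDS = [            # (max probability of default, grade)
    (0.01, "A"),     # <= 1% PD
    (0.03, "B"),
    (0.08, "C"),
    (0.20, "D"),
    (1.00, "E"),
]

def customer_risk_grade(pd_12m: float) -> str:
    """Translate a 12-month default probability into a letter grade."""
    for max_pd, grade in BANDS:
        if pd_12m <= max_pd:
            return grade
    raise ValueError("probability must be in [0, 1]")

print(customer_risk_grade(0.025))  # -> B
```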
By: Joel Pruis

So we know we need to determine the overall net yield on assets required to cover the cost of funds and the operating expenses – but how? In the movie Moneyball, the Oakland A's develop a strategy to win 99 games by scoring 814 runs and only allowing 645 runs by the opposition. In order to generate the necessary runs, Peter Brand boils all the stats down to one number: on-base percentage. By looking at the on-base percentage of all the players in the league, Brand is able to determine the likelihood of generating runs. There are a few key phrases/quotes from this scene that need to be highlighted:

- "It's about getting things down to one number"
- "People are overlooked for a variety of biased reasons and 'perceived' flaws."
- "Bill James and mathematics cut straight through that [biased reasons and perceived flaws]."

Getting things down to one number is the liberating element for the Oakland A's – and for banking. We have already identified the one number for banking: net yield on assets. Let's define this a bit further, though. For this exercise, net yield means the gross yield (interest income plus fee income) on assets, less charge-offs. We are looking for the consistent return on the assets, less the expected net charge-offs related to those assets. When Billy Beane and Peter Brand got it down to the one number, "on-base percentage", it altered the player selection process and highlighted the biases of the scouts, such as:

- Giambi's brother was "getting a little thick around the waist"
- "Old Man" Justice
- Justice will be "lucky if he hits his weight" in July and August
- Justice's "legs are gone"
- Hatteberg "can't throw"
- Hatteberg's "best part of his career is over"
- Hatteberg "walks a lot"

None of the above comments used any facts or data to disprove each player's on-base percentage. Can you imagine if they were underwriters or lenders? What type of compliance issues would we have on our hands with the above comments? Bias against disabilities (Hatteberg with nerve damage), age discrimination ("Old Man" Justice), physical appearance (Giambi's brother "getting a little thick around the waist") – these scouts would be a compliance liability, let alone obstacles to any type of organizational change. But one can readily see how focusing on one number liberates the thinking and removes the old constraints or ways of thinking. One of the scouts commented that Hatteberg had a high on-base percentage because he walks a lot, treating a walk as a negative while a hit is a positive – but why? Why is getting on base by being walked a negative, but getting on base with a hit a positive? The result is the same, as the movie points out. How about in commercial lending? If we focus on net yield on the portfolio as the one number, does that do anything to remove biases? I believe that it does. One example is the perception of charge-offs in a portfolio. To this day the notion of a charge-off in a commercial portfolio, even in the small business portfolio, is frowned upon and can jeopardize one's career. Similar to the walk, the charge-off is not desired, but if we focus on the one number, net yield, it actually removes the stigma of the charge-off! If we need at minimum a 6% net asset yield and we are able to generate a gross yield of 9% with an expected loss rate of 2%, we actually exceed our "one number" target of a 6% net yield, with an expected net yield of 7% (the arithmetic is spelled out below).
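Here is that arithmetic. The 9%, 2% and 6% figures come from the text; the split of gross yield between interest and fee income is my illustrative assumption:

```python
# Net yield example from the text: 9% gross yield, 2% expected charge-offs,
# against a 6% minimum target.
interest_income_rate = 0.07   # assumed share of the 9% gross yield
fee_income_rate      = 0.02   # assumed share (text only gives the 9% total)
charge_off_rate      = 0.02
target_net_yield     = 0.06

gross_yield = interest_income_rate + fee_income_rate   # 0.09
net_yield = gross_yield - charge_off_rate              # 0.07
print(f"net yield {net_yield:.1%} vs target {target_net_yield:.1%}: "
      f"surplus {net_yield - target_net_yield:.1%}")
# -> net yield 7.0% vs target 6.0%: surplus 1.0%
```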
With that change removing the biases and flawed perceptions, can we now start to find opportunities that give us the ability to step away from the norm, stop competing with the rest, and generate that higher return that is required? What are the potential biases and flawed perceptions that will need to be addressed?

- "High risk" industries?
- "Undesired" loan types?
- Consumer vs. commercial?
- Real estate secured vs. unsecured?
- Loans vs. treasuries or other earning asset types?

But just as in the movie, you need to be prepared for the response you may get from the traditional 'seasoned' lenders in your organization. When Billy Beane puts the new strategy into place at the Oakland A's, the lead scout responds with:

- "You don't put a team together with a computer"
- "Baseball isn't just numbers, it isn't science. If it was, anybody could do what we do, but they can't."
- "They don't know what we know. They don't have our experience and they don't have our intuition."

Ah, just like the traditional baseball scout is the traditional commercial lender, with the years of experience, judgment and intuition. I used to be one, and I used almost word for word the same argument against credit scoring and small business before I truly understood what it was all about. Don't get me wrong: experience, judgment and intuition are valuable and necessary. But that type of judgment tends to get into trouble when it stops looking outside for data and relies only on past personal experience to assess the next moves. Experience is always important, but it has to continually review, assess and interpret the data. So let's start looking at the different types of data. On deck – how do we know how many runs the opposition is going to score? The use of external data.
By: Joel Pruis

I am going to take some liberties here. Nowhere in the movie Moneyball does Peter Brand tell us how he got to the magic number of winning 99 games to get to the playoffs. My assumption is that, given the way he evaluates the Oakland A's, he also evaluates the other teams in their conference. Assessing the competitive landscape provides Brand with the estimated runs their opponents will generate. Now, we could take the approach that such analysis correlates to assessing how your competition is going to perform, but I am going to take a different approach. I would compare the conference assessment in Moneyball to an economic forecast/assessment. We need to assess the overall conditions in which we must operate that will allow us to generate the net yield on the assets of our financial institution. Some of the things we need to assess to determine what we will be able to generate related to the net yield on assets:

- Gross yield on assets
- Current interest rate environment (yield on treasuries, federal home loan bank, etc.)
- Interest rate trends (increasing, declining, trends toward fixed rates, variable rates)
- Industry information
- General trend of businesses across the nation: How are businesses faring? How well are they paying their creditors? Are they relying more or less on credit? Are new businesses being started? Are they succeeding? Are they failing?
- General trends (same as above) within your financial institution's market footprint

One such source of industry information is the Small Business Credit Index generated by Experian and Moody's Analytics. The recent release of the Small Business Credit Index indicates small business credit conditions strengthened from the prior quarter, with the index moving from 104.3 to 109. But this is from a national perspective. Depending on your financial institution, it is important to always get an overall view of the economy, but more importantly to see what is happening in your particular market footprint. Just as the Oakland A's in Moneyball maintained an overall perspective of Major League Baseball, their focus for success was targeting their specific conference to reach the playoffs. So as we look at information such as the Small Business Credit Index, we are able to see highlights of regional trends (certain states west of the Mississippi are doing better, while certain states along the east coast are not) and specific industry trends. From such data we need to drill down into our specific footprint and current portfolio. We need to review such items as:

- What industry concentrations do we have that are doing well in the economy, and how is our portfolio doing compared to the external data?
- What industries are we not engaging that may provide a good opportunity for our financial institution?
- What changes are taking place in the general economy that may impact our ability to achieve our expected results?
- What external factors must we be monitoring that may impact our strategy (such as the impact of Obamacare on hiring at businesses with more than 50 employees)?

Just as Brand continues to monitor the performance of the overall league (and the individual players for future trades), we need to continually monitor the national, state and local economies to determine what adjustments we will need to make to achieve our strategies. So we have assessed the general environment; on to strategies, or "How do we win 99 games with a total payroll of $38 million?"
By: Joel Pruis

What is it we as bankers are trying to accomplish? If you have been in the industry for 20+ years, this question may sound ridiculous! We do what we do! We are bankers! What do you mean, define what we are trying to do? But that is the question: what is it we are trying to do? I am going to propose we boil it down to the basic, fundamental element: banks aggregate money from various sources and redeploy these funds to earn a return for the shareholders. Ultimately, our objective is to generate an appropriate return for the shareholders.

Getting back to the movie Moneyball: Billy Beane and Peter Brand define the objective of the Oakland A's for the season in terms of the number of wins needed to assure, with high probability, that the team makes the playoffs (similar to banking's objective of generating an appropriate return for the shareholders). But Peter Brand quickly moves to very specific targets required for the A's to make it to the playoffs, namely winning 99 regular season games. In order to win 99 regular season games, the A's offense will need to score 814 runs in the season and defensively allow only 645 runs. Plain and simple. Very objective, very measurable, and all based upon data, data, data. Let's break this down. Based upon their conference, the teams in their conference and the overall schedule, Peter Brand projects that 99 wins are necessary to land a spot in the playoffs. No gut check, no darts or crystal ball, but rather historical data that, when analyzed, provides the benchmark of 99 wins to statistically assure the Oakland A's that they will make the playoffs.

So let's apply this to banking. Our objective is to generate the appropriate return for our shareholders, or the old return on equity. So, for example, if our targeted return on equity is 20% (making the playoffs), we need to make sure we generate enough net income (99 wins) by producing the necessary gross yield on assets (the 814 runs generated by the Oakland A's offense) less the expected charge-offs (the 645 runs allowed by the Oakland A's defense). For a quick dive into the details, our data would provide a margin of error on each variable to give statistical assurance of achieving the objective (return on equity). In the movie there is no guarantee that 814 runs will win the conference, but at the same time there is no guarantee that the Oakland A's opponents will score 645 runs. Never in the movie does the coach, Billy Beane or Peter Brand tell the team, "You only have to score X number of runs this game, don't score any more." Or even crazier, "You are not letting the other team score enough runs; they need to score 645!" No, the strategy is still to generate as many runs as possible while minimizing the number of runs scored by the opposition. Rather, it is the total amount of earning assets of the financial institution and the overall credit quality that we must understand and control to determine our ability to generate the net yield on assets required to produce the required return on equity. If we assume too much risk in the portfolio in order to generate the required yield, it would be similar to having a poor pitching staff projected to allow 10 runs a game, requiring the team to produce 11 runs a game in order to win. It just is not realistic. (A worked example of the ROE-to-net-yield arithmetic follows below.)
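To see how a 20% ROE target cascades into a required net yield on assets, here is a worked sketch. Only the 20% target comes from the text; the equity ratio, operating expense ratio and funding cost are my assumptions for illustration:

```python
# Back into the required net yield on assets from an ROE target.
roe_target       = 0.20   # "making the playoffs"
equity_to_assets = 0.10   # assumed capital structure
opex_ratio       = 0.025  # operating expenses / assets, assumed
funding_cost     = 0.015  # cost of funds / assets, assumed (taxes ignored)

profit_on_assets = roe_target * equity_to_assets        # 2.0% of assets
required_net_yield = profit_on_assets + opex_ratio + funding_cost
print(f"required net yield on assets: {required_net_yield:.1%}")  # -> 6.0%
```

Under these assumed figures, the bank's "one number" works out to the 6% net yield target used in the earlier example.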
So basically we need to assess, at a high level, whether we are appropriately structured to allow for the generation of enough profit to provide the appropriate return on equity. At this point, we do not need to complicate it any further than that. Now let's take a look at the constraints. We know we have them in banking, so let's take a look at probably the single biggest constraint imposed on Billy Beane and the Oakland A's. In the movie, before Billy Beane is even aware of the Moneyball concept, he is given his constraint by the owner. Beane asks for more money to 'buy players' and is flat out rejected by the owner. The owner, in fact, cuts Beane off by asking, "Is there anything else I can do for you?" The net result is that the Oakland A's have $38 million for payroll versus the New York Yankees at $120 million. Seriously, it does not seem fair. How can you attract the needed talent when you cannot pay the type of salary needed to get the players necessary to win a championship? Let's rephrase this for banking: how can a bank be expected to deploy its assets when such a high rate of return is required? Boiling it down to a specific example: "How can I originate a commercial loan at this rate of interest when the competition is ½ to 1% lower than our rates?" Up next – why will 99 games get us to the playoffs? How do we assess the environment?
By: Joel Pruis

Times are definitely different in the banking world today. Regulations, competition from other areas, specialized lenders and different lending methods have resulted in the competitive landscape we have today. One area that is significantly different today, and for the better, is the availability of data: data from our core accounting systems, data from our loan origination systems, data from the credit bureaus for consumers and for businesses. You name it, there is likely a data source that at least touches on the area, if not provides full coverage. But what are we doing with all this data? How are we using it to improve our business model in the banking environment? Does it even factor into the equation when we are making tactical or strategic decisions affecting our business? Unfortunately, I too often see business decisions being made based upon anecdotal evidence rather than the actual data.

Let's take, for example, Major League Baseball. How many statistics have been gathered on baseball? I remember as a boy keeping the stats while attending a Detroit Tigers game – writing down the lineup, what happened when each player was up to bat, strikes, balls, hits, outs, etc. A lot of stats, but were they the right stats? How did these stats correlate to whether the team won or lost? Does the performance in one game translate into predictable performance over an entire season for a player or a team? Obviously one game does not determine an entire season, but how often do we reference a single event as the basis for a strategic decision? How often do we make decisions based upon traditional methods without questioning why? Do we even reference traditional stats when making strategic decisions? Or do we make decisions based upon other factors, as the scouts of the Oakland A's were doing in the movie Moneyball?

In one scene of the movie, Billy Beane, general manager of the A's, asks his team of scouts to define the problem they are trying to solve. The responses are all very subjective in nature and only concern how to replace "talented" players who were lost due to contract negotiations, etc. Nowhere in this scene do any of the scouts provide any true stats for who they want to pursue to replace the players they just lost. Everything the scouts talk about relates to singular assessments of traits that have not been demonstrated to correlate with a team making the playoffs, let alone winning a single game. The scouts, with all of their experience, focus on a player's swing, ability to throw, running speed, etc. At one point the scouts even talk about the appearance of the players' girlfriends! But what if we changed how we looked at the sport of baseball? What if we modified the stats used to compile a team and to determine how much to pay for an individual player? The movie Moneyball highlights this assessment of the conventional stats and their impact or correlation to a team actually winning games and, more importantly, the overall regular season. Bill James is given the credit in the movie for developing the methodology ultimately used by the Oakland A's; this methodology is also referred to as sabermetrics. In another scene, Peter Brand explains how baseball is stuck in the old style of thinking. The traditional perspective is to buy 'players'. In viewing baseball as buying players, the traditional baseball industry has created a model/profile of what makes a successful or valuable player. Buy the right talent and then, hopefully, the team will win.
Instead, Brand changes the buy from players to buying wins. Buying wins requires buying runs – in other words, buy enough average runs per game and you should outscore your opponent and win enough games to win your conference. But why does that mean we have to change the way we look at individual players? Doesn't a high batting average have some correlation to the number of runs scored? Don't RBIs (runs batted in) have some level of correlation to runs? I'm sure there is some correlation, but as you start to look at the entire team, or at the development of the lineup for any given game, do these stats/metrics have the best correlation to a win – or, more specifically, to the predictability of a winning season? Similarly, regardless of how we as bankers have made strategic decisions in the past, it is clear that we have to first figure out exactly what it is we are trying to solve, what we are trying to accomplish. We have the buzzwords, the traditional responses, the non-specific high-level descriptions that ultimately leave us with no specific direction. Ultimately, that allows us to just continue the business-as-usual approach and hope for the best. In the next few upcoming blogs, we will continue to use the movie Moneyball as the backdrop for how we need to stir things up, identify exactly what it is we are trying to solve and figure out how best to approach the solution.
By: Maria Moynihan

Cybersecurity, identity management and fraud are common and prevalent challenges across both the public sector and the private sector. Industries as diverse as credit card issuers, retail banking, telecom service providers and eCommerce merchants face fraud threats ranging from first-party fraud and commercial fraud to identity theft. If you think the problem isn't as bad as it seems, the statistics speak for themselves:

- Fraud accounts for 19% of the $600 billion to $800 billion in waste in the U.S. healthcare system annually
- Medical identity theft makes up about 3% of the 8.3 million overall victims of identity theft
- In 2011, there were 431 million adult victims of cybercrime in 24 countries
- In fiscal year 2012, the IRS' specialized identity theft unit saw a 78% spike from the prior year in the number of ID theft cases submitted

The public sector can easily apply the same best practices found in the private sector for ID verification, fraud detection and risk mitigation. Here are four sure-fire ways to get ahead of the problem:

- Implement a risk-based authentication process in citizen enrollment and account management programs
- Include the right depth and breadth of data through public and private sources to best identity-proof businesses or citizens
- Offer real-time identity verification while ensuring security and privacy of information
- Provide a knowledge-based authentication (KBA) software solution that asks applicants approved random questions based on "out-of-wallet" data

What fraud protection tactics has your organization implemented? See what industry experts suggest as best practices for fraud protection, and stay tuned as I share more on this topic in future posts. You can view past Public Sector blog posts here.
By: Maria Moynihan

State and federal agencies are tasked with overseeing the integration of the new Health Insurance Exchanges, and with that responsibility comes the effort of managing information updates, ensuring smooth data transfer and implementing proper security measures. The migration process for HIEs is no simple undertaking, but with these three steps, agencies can plan for a smooth transition:

Step 1: Ensure all current contact information is accurate with the aid of a back-end cleansing tool. Back-end tools clean and enhance existing address records and can help agencies maintain the validity of records over time.

Step 2: Identify duplicates. Duplicate identification is a critical component of any successful database migration: by identifying and removing existing duplicate records, and preventing the future creation of duplicates, constituents are prevented from opening multiple cases, reducing the opportunity for fraud. (A toy sketch of this step follows below.)

Step 3: Validate contact data as it is captured. This step is extremely important, especially as information gets captured across multiple touch points and portals. Contact record validation and authentication is a best practice for any database or system gateway.

Agencies, particularly those responsible for the successful launch of HIEs, are expected to leverage advanced technology, data and sophisticated tools to improve efficiency, quality of care and patient safety. Without accurate, standardized and verified contact information, none of that is possible. Access the full Health Insurance Exchange Toolkit by clicking here.
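Here is that sketch of Step 2's core move: normalize identifying fields into a match key, then group records that collide. Real migrations use fuzzier matching (phonetics, address standardization), but the shape is the same; all records below are invented:

```python
# Minimal duplicate detection for contact records: normalize the fields that
# identify a person, then bucket records on the resulting match key.
from collections import defaultdict

def match_key(record: dict) -> tuple:
    norm = lambda s: " ".join(s.lower().split())   # case/whitespace-insensitive
    return (norm(record["name"]), norm(record["address"]), record["dob"])

def find_duplicates(records: list[dict]) -> list[list[dict]]:
    buckets = defaultdict(list)
    for r in records:
        buckets[match_key(r)].append(r)
    return [grp for grp in buckets.values() if len(grp) > 1]

rows = [
    {"name": "Ana Diaz",  "address": "12 Oak St", "dob": "1980-04-02"},
    {"name": "ANA  DIAZ", "address": "12 oak st", "dob": "1980-04-02"},
    {"name": "Bo Chen",   "address": "9 Elm Ave", "dob": "1975-11-19"},
]
print(len(find_duplicates(rows)))  # -> 1 duplicate group (the two Ana Diaz rows)
```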
Big news [last week], with Chase entering into a 10-year expanded partnership with Visa to create a 'differentiated experience' for its merchants and consumers. I would warn anyone thinking "offers and deals" when they hear "differentiated experience" – because I believe we are running low on merchants who have a perennial interest in offering endless discounts to their clientele. I cringe every time someone waxes poetic about offers and deals driving mobile payment adoption, because I have yet to meet a merchant who wanted to offer a discount to everyone who shopped. There is an art and a science to discounting, and merchants want to identify customers who are price sensitive and develop appropriate strategies to increase stickiness and build incremental value. It's as if everyone everywhere is throwing everything and the kitchen sink at making things stick.

On one end are the payments worshippers, for whom the art of payment is the centerpiece – the tap, the wave, the scan. We pore over the customer experience at the till, believing that if we make it easier for customers to redeem coupons, they will choose us over the swipe. But what about the majority of transactions where a coupon is not presented, where we swipe simply because it's the easiest, safest and most boring thing to do? Look at the Braintree/Venmo model, where payment is but a necessary evil – the payment is pushed so far behind the curtain that the customer spends nary a thought on her funding source of choice. Consumers are issuer-agnostic to a fault, a model propounded by Square's Wallet. After all, when the interaction is tokenized, when a name or an image can stand in for a piece of plastic, what use is there for an issuer's brand?

So what are issuers doing? Those that have a processing and acquiring arm are increasingly looking at creative transaction routing strategies for transactions where the issuer finds it has a direct relationship with both the merchant and the consumer. This type of selective routing enables the issuer to conveniently negotiate pricing with the merchant, thereby encouraging the merchant to incent its customers to pay using the card issued by that same issuer. For this strategy to succeed, issuers need both to sign up merchants directly and to encourage their customers to spend at these merchants using their credit and debit cards. FIs continue to believe they can channel customers to their chosen brands, but "transactional data doth not maketh the man" – and I continue to be underwhelmed by issuer efforts in this space. Visa ending its ban on retailer discounts for specific issuer cards this week must be viewed in this context, as it fuels rumors that other issuers are looking at the private payment network option, with merchants preferring their cards over competitors' explicitly. The wild, wild west, indeed. This drives processors either to cut deals directly with issuers or to move far deeper into the merchants' hands. This is where the Braintree/Venmo model can come into play – where the merchant, aided by an innovative processor who can scale, can replicate the same model in the physical world. We have already seen what Chase Paymentech plans to do. There aren't many that can pull off something similar.

Finally, what about Affirm, the new startup by Max Levchin? I have my reservations about the viability of a Klarna-type approach in the US, where there is a high level of credit card penetration among US customers.
Since Affirm will require customers to choose it as a payment option over other funding sources – PayPal, credit cards and others – there has to be a compelling reason for a customer to choose Affirm. And at least in the US, where we are card-entrenched and every day we make it easier for customers to use their cards (look at Braintree or Stripe), it's a tough value proposition for Affirm. Share your opinions below. This is a re-post from Cherian's personal blog at DropLabs.
Last January, I published an article in the Credit Union Journal covering the trend among banks to return to portfolio growth. Over the year, the desire to return to portfolio growth and maximize customer relationships has continued to be a strong focus, especially in mature credit markets such as the US and Canada. Let's revisit this topic, dive deeper into the challenges we've seen, explore the core fundamentals for setting customer lending limits, and share a few best practices for creating successful cross-sell lending strategies.

Historically, credit unions and banks have driven portfolio growth with aggressive outbound marketing offers designed to attract new customers and members through loan acquisitions. These offers were typically aligned to a particular product, with no strategic alignment between the multiple divisions within the organization. Further, when existing customers submitted a new request for credit, they were treated the same as incoming new customers, with no reference to the overall value of the existing relationship. Today, however, financial institutions are looking to create more value from existing customer relationships to drive sustained portfolio growth by increasing customer retention, loyalty and wallet share.

Let's consider this idea further. Effective cross-sell strategies that identify the needs of existing customers and match them to individual credit risk and affordability can ensure that portfolio growth is achieved while simultaneously increasing customer satisfaction and promoting loyalty. The need to optimize customer touch points and provide the best possible customer experience is paramount to future performance, as measured by market share and long-term customer profitability. By also responding rapidly to changing customer credit needs, you can further build trust, increase wallet share and profitably grow your loan portfolios. In the simplest sense, the more of your products a customer uses, the less likely the customer is to leave you for the competition. With these objectives in mind, financial organizations are turning to the practice of setting holistic, customer-level credit lending parameters. These parameters are often referred to as umbrella, or customer, lending limits.

The challenges: Although the benefits of enhancing existing relationships are clear, there are a number of challenges that raise some important questions:

- How do you balance the competing objectives of portfolio loan growth while managing future losses?
- How do you know how much your customer can afford?
- How do you ensure that customers have access to the products they need, when they need them?
- What is the appropriate communication method to position the offer?

Few credit unions or banks have lending strategies that differentiate between new and existing customers. In most cases, new credit requests are processed identically for both customer groups. The problem with this approach is that it fails to capture and use the power of existing customer data, which will inevitably lead to suboptimal decisions. Similarly, financial institutions frequently send inconsistent lending messages to their clients.
The following scenarios can arise when institutions fail to look across all relationships to support their core lending and collections processes:

- A customer is refused additional credit on the facility of their choice, whilst simultaneously being offered an increase in their credit line on another.
- A customer is extended credit on a new facility whilst being seriously delinquent on another.
- A customer receives marketing solicitations for three different products from the same institution, in the same week, through three different channels.

Essentials for customer lending limits and successful cross-selling: By evaluating existing customers on a periodic (monthly) basis, financial institutions can holistically assess the customer's existing exposure, risk and affordability. By setting customer-level lending limits in accordance with these parameters, core lending processes can be made more efficient, with superior results and enhanced customer satisfaction (a toy sketch follows below). This approach can be extended to a fast-track application process for existing relationships with high-value, low-risk customers. Traditionally, business processes have not identified loan applications from such individuals for preferential treatment. The core fundamentals of the approach necessary for setting holistic customer lending (umbrella) limits include:

- The accurate evaluation of credit and default risk
- The calculation of additional lending capacity and affordability
- Appropriate product offerings for cross-sell
- Operational deployment

Follow my blog series over the next few months as we explore the core fundamentals for setting customer lending limits and share a few best practices for creating successful cross-sell lending strategies.
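Here is that toy sketch of the umbrella-limit idea: derive a customer-level cap from risk and affordability, subtract current exposure, and let every product decision draw on the remaining headroom. The multiplier table and affordability rule are illustrative assumptions, not a recommended policy:

```python
# Illustrative customer-level (umbrella) lending limit.
# The risk multipliers and affordability rule are invented for illustration.
RISK_MULTIPLIER = {"A": 1.0, "B": 0.7, "C": 0.4, "D": 0.15, "E": 0.0}

def umbrella_headroom(annual_income: float, risk_grade: str,
                      existing_exposure: float) -> float:
    """Remaining lending capacity across all products for one customer."""
    capacity = annual_income * RISK_MULTIPLIER[risk_grade]  # affordability cap
    return max(capacity - existing_exposure, 0.0)

# Mr. Jones & Ms. Smith evaluated as one financially-linked customer:
headroom = umbrella_headroom(annual_income=90_000, risk_grade="B",
                             existing_exposure=40_000)
print(headroom)  # -> 23000.0 available across any new facility
```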
MCX – MerChants reduX: The post that follows is a collection of thoughts around MCX – why it deserves respect, and yet how it is indeed mortal and bleeds like all others. For those who are not familiar with MCX: it's a consortium of over 30 leading national retailers with a singular purpose – to create a seamlessly integrated mobile commerce platform. The website for MCX is http://www.mcx.com. The consortium is led by merchants like Walmart, Target, CVS, BestBuy, Gap, Sears, etc. By 2012, the mobile payments space was already fragmented, which itself may have precipitated the launch of MCX. And for a number of solutions looking for traction, things ground to a halt when MCX pitched merchants a solution that needed no costly upgrades, with a promise to route transactions over low-cost routing options. My friends on the issuer side privately confide that MCX has in fact succeeded in throwing a monkey wrench into their mobile payment plans – and merchant acceptance looks ambiguous for incumbent initiatives such as Isis and Google Wallet, as well as for alternative payment initiatives. It had been easy to call it mere posturing and ignore it in the early days, but of late there is a lot of hand-wringing behind the scenes and too many furrowed brows, as if the realization finally struck that merchants were indeed once again crucial to mobile payment adoption.

MCX – its raison d'être: Meanwhile, the stakeholders behind MCX have been religious in their affirmation that MCX lives by two core tenets: first, it aims to drastically reduce payment acceptance costs through any and all means; and second, to keep merchant data firmly within their purview. I can't help but think that the latter was no more than an afterthought, because merchants can individually choose whether they wish to share customer preferences or Level III data with third parties, but they need all the collective clout they can muster to push networks and issuers to agree to reduce card acceptance costs. So if one distils MCX down to its raison d'être, it looks to be aimed squarely at No. 1. Which is fair when you consider that merchants believe card fees are one of their biggest operating expenses. In 2007, 146,000 convenience stores and gas stations nationwide made a total of $3.4B in profits, yet they paid out $7.6B in card acceptance costs (link). And MCX is smart to talk about the value of merchant data, the need to control it, yada yada yada. But if that were indeed more important, Isis could have been the partner of choice – someone who would treat customer and transaction data as sacrosanct and leave it behind for the merchants to fiddle with (vs. Google Wallet's mine..mine..mine.. strategy). But in the same way HomeDepot was disappointed when it first saw Google Wallet – no interchange relief, incremental benefits at the point of sale, and it swoops up all their data in return – Isis also offers little relief to MCX or its merchants, even without requiring any transaction or SKU-level data in return.

Does it mean that carriers have no meaningful role to play in commerce? Au contraire. They do. But it's around fraud and authentication. It's around identity. And around creating a platform for merchants to deliver coupons and alerts to opted-in customers.
But they seem to be stuck imitating Google – figuring out a play at the front end of the purchase funnel, trying to become a consumer brand. The last thing they want is to leave it to Apple to figure out the "identity management" question – which Apple seems best equipped to answer, by way of its scale, the control it exerts over its ecosystem, a vertical integration strategy that lets it fold biometrics meaningfully into its lineup, and its ability to start with its own services to offer customer value. Did we say Apple? I've written before about how its AuthenTec acquisition points to a biometric solution – packaged neatly with an NFC chip and secure element – that could address the factors that have held back customer adoption of biometrics.

Why NFC? If NFC was originally meant to seamlessly and securely share content, what better way to sign that content, to have it be attributable to its original author, or to enforce one's rights to said content, than to sign it with one's digital signature. Identity is key, not just when enforcing digital rights management on shared content, but also to secure commerce and address payment/fraud risk.

Back to MCX. The more I read, the more it seems MCX is trying to imitate Isis in competing for customer mindshare – attempting to become a consumer brand rather than simply a cheaper platform for payment transactions. As commerce evolves beyond being cleanly classified as "Card Present" or "Card Not Present" – as transactions originate online but get fulfilled in stores – merchants expect the rules to evolve alongside reality. For example, when customers can order online but pick up in-store after showing a picture ID, why should merchants pay "Card Not Present" rates? Higher CNP rates are justified by higher risk – so why assume the same amount of risk in this changed scenario? And as technology innovation blurs the lines that once neatly categorized commerce – where "Card Present" gives way to "Mobile Present", and the mobile device carries significant additional context that could be scored to address or quantify risk – why shouldn't it be? It's a given that networks will have to accommodate the reduced risk of transactions where mobile plays a role – where the merchant or the platform enabling the transaction can meaningfully use that context to validate customer presence at the point of sale – and that merchants will expect appropriate interchange reduction in those scenarios. (A sketch of what such context scoring might look like follows at the end of this section.)

MCX – A brand like Isis or a platform? Reading portions of the linked NRF blog, and elsewhere, reflects a misplaced desire on MCX's part to become a consumer-facing solution – an app that all MCX partners will embrace for payment. This is so much like the Isis solution of today – which I have written about – and why it flies in the face of reason.
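To make the "Mobile Present" argument concrete, here is a minimal sketch of how the extra context a phone carries might be scored into a transaction-risk estimate. Every signal name and weight is an illustrative assumption on my part, not any network's actual model.

```python
# Hypothetical sketch: scoring a "mobile present" transaction using the
# additional context a phone carries, to argue for CP-like (rather than
# CNP) risk pricing. All signals and weights are illustrative assumptions.

def mobile_present_risk(device_known: bool,
                        geo_matches_store: bool,
                        authenticated_on_device: bool,
                        account_age_days: int) -> float:
    """Return a 0..1 risk estimate (lower = safer) from mobile context."""
    risk = 0.5  # neutral baseline, as if this were a plain CNP transaction

    if device_known:               # handset previously seen for this customer
        risk -= 0.15
    if geo_matches_store:          # device location places the customer in-store
        risk -= 0.15
    if authenticated_on_device:    # customer passed a PIN/biometric check
        risk -= 0.15
    if account_age_days < 30:      # brand-new accounts stay riskier
        risk += 0.20

    return max(min(risk, 1.0), 0.0)
```

The point isn't the particular weights – it's that this context exists, is machine-readable, and can therefore be priced into interchange.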
Isis – the nexus between carriers and FIs – is a powerful notion, if one considers the role it could play in enabling an open platform around provisioning, authentication and marketing. But for that future to materialize, Isis has to stop competing with Google, accept that it has little role to play by itself at the front end of the funnel, and recede to the role of an enabler – one that puts its partner FI brands front and center, allows Chase's customers to pay using Chase's mobile app instead of Isis's, and drives down fraud risk at the point of sale by meaningfully authenticating the customer via his location, the mobile assets carriers control, and the historical data they have on the customer. It's those three points of data, and the scale Isis can bring, that put it credibly in the payments value chain – not the evaporating control around the secure element.

In the same vein, the value MCX brings to merchants is the collective negotiating power of over 30 national merchants. But is it a new consumer brand, or is it a platform focused on routing transactions over the least-cost routing option? If it's the latter, then it has a strong parallel in PayPal. And as we see PayPal pop up as legal tender in many a retailer's mobile apps and checkout aisles, MCX is likely to succeed by emulating that retailer-aligned strategy rather than building a brand of its own. Further, if MCX wants customers to pay using less costly means – whether private label, prepaid or ACH – then it and its partners must do everything they can to shift the customer's focus away from preferred payment methods and toward the customer experience and the resulting value around loyalty. MCX must build its value proposition elsewhere, and make its preferred payment methods the bridge that gets the customer there.

Another example where the retailer focused too much on the payment, and less on the customer experience, is the Safeway Fast Forward program. The value proposition is clear for the customer: pay using your Safeway Fast Forward card number and a self-assigned PIN for simpler checkout. However, to set up an account the customer must provide a state-issued ID (driver's license) and, on top of that, his Social Security Number (Safeway Fast Forward requirements here). What customer would, for the incremental convenience of paying via his Fast Forward card and PIN, be willing to entrust Safeway with his Social Security Number? Clearly Safeway's risk team had a say in this, and instead of coming up with better ways to answer questions around risk and fraud, they introduced a non-starter that killed any opportunity for meaningful adoption.

MCX & adoption

So where does that leave MCX? Why will I use it? How will it address questions around adoption? It's a given that it will have to answer the same questions around fraud and authentication, during customer on-boarding and at a transactional level. And it's not enough these days to simply answer questions pertaining to the customer; one must also address the integrity and reputation of the device the customer uses – whether a mobile device or a laptop PC. But beyond fraud and auth, there are difficult questions around what would compel a techno-luddite who has historically paid using a credit instrument to opt for an ACH-driven (I am guessing) MCX payment scheme. Well, for one: MCX and its retail partners can control the purchasing power of MCX credits – more on that after a brief aside on the routing question.
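The aside first: since least-cost routing is the crux of the platform argument above, here is a minimal sketch of what such a router might look like. The fee figures are rough illustrative assumptions – an ACH flat fee, in-house private-label processing, and an interchange-style card fee – not actual rates.

```python
# Hypothetical sketch: least-cost routing across tender types, the "platform"
# role suggested for MCX above. All fee figures are illustrative assumptions.

TENDER_COSTS = {
    "ach":           lambda amount: 0.25,                  # flat per-item fee
    "private_label": lambda amount: amount * 0.005,        # in-house processing
    "credit_card":   lambda amount: 0.10 + amount * 0.02,  # interchange-style fee
}

def route_payment(amount: float, tenders_on_file: list[str]) -> str:
    """Pick the cheapest available tender for this transaction."""
    costs = {t: TENDER_COSTS[t](amount) for t in tenders_on_file}
    return min(costs, key=costs.get)

# A $120 basket with all three tenders on file routes to ACH ($0.25)
# over private label ($0.60) or a card ($2.50).
print(route_payment(120.0, ["ach", "private_label", "credit_card"]))  # -> "ach"
```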
Now, those credits. Suppose that, after aggregating customer profiles across its retailers, MCX determines the Addams family spends a collective $400 per month, on average, across all MCX retailers. MCX could propose that if the Addams family instead committed to buying $450 in MCX credits each month, their purchasing power would grow by an additional $45 in credits, usable in specific retail categories (or flat out across all merchandise). Would Morticia be interested? And if she were, what would that mean for MCX? It would eliminate having to pay interchange on approximately $500 a month, and it would let its partners capture an incremental spend of roughly 10% that did not exist before. Only merchants can pull this off – by leveraging past trends and close relationships with CPG manufacturers, and by giving Morticia new reasons to spend in the manner they want her to. (A quick back-of-the-envelope sketch of these numbers follows at the end of this post.)

But then again, where does MCX stop providing a level playing field for its partners and step back, so that merchants can start to compete for their customers and their spend? And finally, can it survive the natural conflicts that will arise, and limit its scope to areas all can agree on – long enough for it to take root? Should MCX become the next Isis or the next PayPal? Which makes more sense? What do you think? Please leave your opinions below...

(This blog post is an adaptation of the original post found at http://www.droplabs.co/?p=662)
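Postscript: a back-of-the-envelope sketch of the Addams family economics above. The 2% blended card interchange rate is my assumption; the other figures come straight from the scenario.

```python
# Back-of-the-envelope sketch of the MCX-credits scenario above.
# The 2% blended interchange rate is an illustrative assumption.

current_spend = 400.0   # Addams family's average monthly spend across MCX retailers
commitment    = 450.0   # prepaid MCX credits they commit to buying each month
bonus_rate    = 0.10    # 10% sweetener on the commitment
interchange   = 0.02    # assumed blended card interchange rate

bonus_credits   = commitment * bonus_rate      # $45 in extra credits
total_credits   = commitment + bonus_credits   # ~$495 of monthly purchasing power
saved_fees      = total_credits * interchange  # interchange avoided on ~$500 of spend
extra_committed = commitment - current_spend   # $50/month of spend shifted in

print(f"purchasing power:       ${total_credits:.2f}/month")
print(f"interchange avoided:    ${saved_fees:.2f}/month per household")
print(f"incremental commitment: ${extra_committed:.2f}/month")
```

Small numbers per household, but multiplied across 30-plus national retailers' customer bases, the avoided interchange alone is the entire business case.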
By: Maria Moynihan

Fact: In fiscal year 2011, the federal government allocated ~$608M to investigate and prosecute cases of alleged fraud in health care programs. Fact: Medicare and Medicaid related scams cost taxpayers more than $60B a year.

These statistics are profound, especially when so many truly need – and rightfully deserve – access to health benefits. To make the facts a bit more tangible: how would you feel if you heard that neighbors of yours were submitting claims to Medicare for treatments that were never provided? In essence, you've got thieves for neighbors, don't you?

Thankfully, government agencies are responding. Even while challenged with reduced budgets and limited resources, they are investing in efficient processes, advanced data, analytics and decisioning tools to improve their visibility into individuals at the point of application. By making adjustments to one or all of these areas, agencies can pinpoint whether or not individuals are who they say they are. Only with precision, relevancy and efficiency of information can fraud and abuse be curtailed.

Below are a few examples of how to improve your eligibility systems or processes today (a simplified sketch follows at the end of this post). Or, simply download the Issue Brief, Beyond Traditional Eligibility Verification, for more detail.

- Use scores, models and screening questions to assess a beneficiary's true identity or level of identity fraud risk.
- Use income and asset estimation models to compare against stated income as a validation step in determining benefits eligibility.
- Create a single system for automatic identification and verification of beneficiaries and businesses applying for service.
- Tighten controls around business identity to weed out fraud rings, syndicates and other forms of business fraud.

The Bottom Line: Only with process, information or system improvements can government agencies move the needle on the growing and pressing issue of fraud and abuse.
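To illustrate how the pieces above might fit together at the point of application, here is a minimal sketch of a screening pass. The thresholds, field names and decision rules are illustrative assumptions on my part; real deployments would rely on production identity- and income-estimation models.

```python
# Hypothetical sketch of point-of-application eligibility screening along
# the lines suggested above. All thresholds and fields are illustrative.

from dataclasses import dataclass

@dataclass
class Applicant:
    identity_risk_score: float   # 0..1 from an identity model (higher = riskier)
    stated_income: float
    estimated_income: float      # from an income/asset estimation model
    business_id_verified: bool   # for business applicants; True for individuals

def screen_application(app: Applicant) -> str:
    """Route an application: approve, refer for manual review, or decline."""
    # 1. Identity check: is the applicant who they say they are?
    if app.identity_risk_score > 0.8:
        return "decline: failed identity verification"

    # 2. Income validation: stated vs. modeled income as an eligibility check.
    if app.stated_income > app.estimated_income * 1.5:
        return "refer: stated income far exceeds estimate"

    # 3. Business identity controls to weed out fraud rings and syndicates.
    if not app.business_id_verified:
        return "refer: business identity not verified"

    return "approve"
```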