Making Data Better

Lockstep Consulting Pty Ltd

Making Data Better is a podcast about data quality and the impact it has on how we protect, manage, and use the digital data critical to our lives. Through conversation and examination, George Peabody and Stephen Wilson look at data's role in risk management, at use cases like identification, lending, age verification, healthcare, and more personal concerns. Privacy and data ownership are topics, as are our data failures.

Top 10 Making Data Better Episodes

Goodpods has curated a list of the 10 best Making Data Better episodes, ranked by the number of listens and likes each episode has garnered from our listeners. If you are listening to Making Data Better for the first time, there's no better place to start than with one of these standout episodes. If you are a fan of the show, vote for your favorite Making Data Better episode by adding your comments to the episode page.

Making Data Better - EP16: Data, Governance, and Public Service: Ian Oppermann

07/28/24 • 39 min

The focus of computer technology historically has been on the manipulation and communication of data and information. Yes, there’s always been the monstrously obvious admonition of “garbage in, garbage out” when speaking of data. But as our dependence on data grows, the issues of data quality, of making data better, have grown in importance and complexity. Data, it turns out, is endlessly nuanced.
Government Data Generation and Usage
Government has an enormous interest in data. It is an issuer of data when, for example, it assigns account numbers to its citizens to ease service delivery. It is also a considerable consumer of data, which it uses to establish policy, measure program efficiency, support planning, and, just as with any business or individual, make decisions of all kinds.
But this isn’t simple. The term “government” masks the fact that multiple agencies exist, each with its own goals, never mind its own data handling policies and procedures. Sharing data across agencies is as nuanced as data sharing between enterprises, or even more so.
Understanding how governments think about the data they consume and generate is key to long-term data security and online identity.
Talking with Data Expert Ian Oppermann
In this fascinating and stimulating conversation, Steve and George discuss these topics with Ian Oppermann, the former data director for the state of New South Wales, a director for Standards Australia, and an advisor to multiple startups.
Ian shares his insider’s knowledge of government agency priorities and the fact that sharing data across agencies is “extraordinarily hard.”
Just at the Beginning
Standards Really Really Matter
Ian’s participation in ISO standards development comes from his insight that data sharing requires very crisp definitions, detailed use cases, and specific guidance for each use case based on privacy and data custodianship requirements. And he points out that we are just at the beginning.
For example, the latest ISO standards tackle the basics: terminology definitions and use cases (ISO 5207) and guidance on data usage (ISO 5212).
These standards do address the AI use case, but even there, at this stage, they cover the basics.
People Matter
As with many technology management issues these days, the concerns are rarely about the tech itself. They're about people, too. Here's Ian:
“If you want to use [data] for important purposes, you actually need people who know and understand what data is, who know and understand what data governance is, and who know and understand how to actually use the data for appropriate purposes and then put guidance restrictions or prohibitions around the data products you create.”

Ian concludes with:
“But [for] the general use of data, we're only just beginning to understand the power, the complexity, the mercurial nature of data and starting to build frameworks around it.”

Take a listen if you care about data management and governance in large organizations. We are just at the beginning of getting this right.


Credential sharing is complex and exciting. Take a listen to our guest, Dan Stemp from JNCTN, in this installment of Making Data Better. We discuss JNCTN's credential sharing platform and its major use cases.
Discover how managing digital identities supports the work of critical industries, from power generation to healthcare. We unpack the intricate relationships between those who rely on credentials, the individuals who hold them, and the authorities who issue them.
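To make that three-party relationship concrete, here is a minimal sketch of the issuer/holder/relying-party pattern, assuming made-up role names and claims rather than JNCTN's or the W3C's actual data model.

```python
# Illustrative sketch of the credential "trust triangle": an authority issues a
# credential to a holder, and a relying party checks who issued it before trusting it.
# Field names are made up for clarity; they are not a real schema.
from dataclasses import dataclass

@dataclass
class Credential:
    issuer: str      # authority that vouches for the claim (e.g. an industry regulator)
    holder: str      # the worker the claim is about
    claim: str       # what is being attested (e.g. "high-voltage switching certified")

def accept(credential: Credential, trusted_issuers: set[str]) -> bool:
    """Relying party's decision: trust the claim only if it trusts the issuer."""
    return credential.issuer in trusted_issuers

cred = Credential(issuer="Energy Safety Authority", holder="worker-042",
                  claim="high-voltage switching certified")
print(accept(cred, trusted_issuers={"Energy Safety Authority"}))  # True
print(accept(cred, trusted_issuers={"Some Other Registry"}))      # False
```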
Dan tells us the story of his firm's evolution from a card personalization bureau to today's digital credential management scheme. We discuss the firm's clients' transition from physical tokens to digital credential presentation. Of course, we discuss wallets, because they are the natural containers for verifiable credentials, and we address JNCTN's proprietary approaches, the W3C, and big players like Apple and Google.
Implementing systems like this is never easy. JNCTN has multiple stakeholders to convince. We examine enterprise adoption and the leverage points that resonate with the relying parties, the risk owners, who deploy these systems. It's not just about risk mitigation and operational efficiency.
So, take a listen as Dan, Steve, and George share their enthusiasm for verifiable credential sharing and the breadth of applications ahead.


A key actor in risk assessment is the data provider. These commercial operations aggregate and analyze the data produced by governments, enterprises, individuals, and even other data providers. All to feed today’s insatiable appetite for understanding who it is we are dealing with online.
In this Making Data Better episode, Steve and George are joined by Cindy Printer, Director of Financial Crime Compliance and Payments at LexisNexis Risk Solutions. The company is a major data provider to government and enterprise; Cindy focuses her work on financial services firms and their need for regulatory compliance.
We discuss the granular nature of the data LexisNexis Risk Solutions offers its customers and the breadth of sources used to meet their needs. It’s astonishing.
Cindy makes the point, one we heartily agree with at Lockstep, that risk is specific, a concern of each individual entity, and that the data each entity requires varies with its particular concerns. That's why LexisNexis Risk Solutions tunes the data services it provides to the industry segment and the individual firm.
Sitting on top of such vast data resources, and knowing the complications of deriving meaning from it all, LexisNexis Risk Solutions also provides analytical services that save an enterprise from having to analyze the data itself.
This is a great conversation if you want to understand the data provider role, the scale of its operations, and its priorities. So take a listen.

Making Data Better - EP2: Data in our digital lives: Many moving parts | 01

06/01/23 • 30 min

Online trust? Digital identification? Wallets? Privacy? Data quality? If these topics resonate, join Lockstep's George Peabody and Steve Wilson in this, the first episode of Making Data Better, a podcast about data quality and the impact it has on how we protect, manage, and use the digital data critical to our lives.
We focus on making data better because data quality has everything to do with making good decisions. We introduce the critical and complex concerns that affect online trust and risk management.
One area where this really applies is in assessing risk. We believe data quality has an outsized impact on risk management. Every decision carries risk, from the relatively trivial choice of where to go out to eat to opening a new bank account or sending money cross-border. Do we trust the data? Does it let us know who or what is on the other end of our digital interaction?
There are so many moving parts to understand, evaluate, and discuss. We’ll unpack all this with experts from fintech, cyber, healthcare, government services, and academia.
So take a listen and join us on this journey.

Making Data Better - EP3: Data in our digital lives: Many moving parts | 02

06/13/23 • 25 min

Self-sovereign identity, digital wallets, risk management, federated identity, and a recap of the Identiverse 2023 conference are among the topics discussed in this episode of Making Data Better. George and Steve complete their landscape view of data quality and security.
Identiverse 2023 reflected those topics, indicating a shift toward digital wallets, passkeys, and verifiable credentials as essential components of a resilient digital ecosystem. We discuss these and what's not being addressed to date.
So take a listen as we set the table for Making Data Better and begin our series. We will speak to practitioners of the latest in data protection, tell the stories of data quality initiatives that have succeeded—and failed—and regularly rise above the weeds to provide perspective.
With so many moving parts and so many changes, Making Data Better keeps it all in context.

Making Data Better - EP1: Introducing Making Data Better

05/25/23 • 2 min

Making Data Better is a podcast about data quality and the impact it has on how we protect, manage and use the digital data critical to our lives. Take a listen to this intro. We're launching the first episodes in June 2023.


In October 2008 Heartland Payment Systems discovered it had been breached. Albert Gonzalez and several other individuals hacked their way through an external company website using SQL injection — an attack that too often still works — to the core of Heartland’s systems. They were able to copy credit and debit card numbers and other data used in payment authorization.
At the time, that data enabled those who bought it to create new magstripe cards.
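For context on the attack technique, here is a minimal sketch of how SQL injection works and how parameterized queries block it, assuming a toy schema and values that have nothing to do with Heartland's actual systems.

```python
# Illustrative sketch (not Heartland's code): SQL injection vs. a parameterized query.
# Table, column, and card values are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cards (merchant TEXT, pan TEXT)")
conn.execute("INSERT INTO cards VALUES ('demo-merchant', '4111111111111111')")

user_input = "' OR '1'='1"  # attacker-controlled value from a web form

# Vulnerable: string concatenation lets the input rewrite the query,
# so it returns every row instead of none.
vulnerable = conn.execute(
    "SELECT pan FROM cards WHERE merchant = '" + user_input + "'"
).fetchall()

# Safer: a parameterized query treats the input as data, not SQL.
parameterized = conn.execute(
    "SELECT pan FROM cards WHERE merchant = ?", (user_input,)
).fetchall()

print(vulnerable)      # [('4111111111111111',)] -- injection succeeded
print(parameterized)   # []                      -- input matched nothing
```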
Some stats about the hack:

  • Heartland’s stock price fell by 77% in the months following the attack.
  • Some 130 million card numbers were exposed.
  • Heartland paid $60 million in fines to Visa, over $40 million to Mastercard, $5 million to Discover, and $3.6 million to AMEX.
  • The business of signing up merchants to accept cards using Heartland’s services took a big hit.

To me, this is also something of a hero story. Because Heartland’s leadership, led by CEO Bob Carr, got angry. Yes, at the hackers. But more importantly, they took that anger and frustration and used it to fill a gaping hole in card system security, way out in front of what the card systems themselves required.
I was fortunate enough to play a minor part in Heartland’s response. As an analyst, I got to know some key players who will tell their part of the story in this episode.

Making Data Better - EP5: Navigating digital ID: The role of government

11/07/23 • 41 min

Ever wondered about government's role in online identification and how it could expand to help our digital economy function better and more safely? Or how government data quality directly impacts risk assessment?
In this episode of Making Data Better, we tackle where US and Australian governments stand on protecting our digital IDs and personal credentials.
Join us as Jeremy Grant, Managing Director of Technology Business Strategy at Venable LLP, brings his insights on security technology strategy, policy, finance, and more to Making Data Better. Jeremy speaks to his decades-long experience with US federal and state government initiatives. And to the work of his organization, the Better Identity Coalition (check out its policy papers for federal and state-level policymakers!)
Government issues the credentials we rely on to prove who we are. Regulating how those credentials are protected so they can enjoy expanded usage is both necessary and fraught with complications. Tech regulation has a history of lagging well behind technology's evolution.
That said, it is coherent policy and political direction that is needed. Disparate agencies may fully understand the potential of the assets they manage, but without strategic focus at the highest level, the challenge of digital ID will remain. And our exposure to fraudsters, synthetic identities, and nation-state attacks will continue.
This is no small matter. FinCEN, the Treasury Department's Financial Crimes Enforcement Network, recently announced its analysis of bank-filed suspicious activity reports: it found that $212 billion in transactions were tied to compromised identity. The Government Accountability Office, the investigative arm of the US Congress, estimated between $100 billion and $135 billion in losses to public benefits fraud during the pandemic.
This is real money, ending up in the hands of organized criminals and adversarial nation-states.
So, take a listen to this episode with Jeremy Grant and Lockstep's Steve Wilson and George Peabody. There's work to be done.


Data provides the basis for how we make decisions. An enemy of security these days, from our point of view, is plain text. We need better than that. We need device-assisted support for proving where data comes from and how it's been handled. We need systems that keep data (and code) from being altered without cause, that give us the ability to trace the change history of data.
Confidential computing is a new compute paradigm that provides a hardware-based foundation for running code and the data it manipulates. It safeguards data and code (it's all data; it's all code) in its most vulnerable state: while it's being processed.
In this episode of Making Data Better, Steve and George are joined by Anjuna's Mark Bauer to dive into this new model's high impact on security and low impact on cloud app development.
Mark dissects the mechanics behind this approach, including how it strengthens the software supply chain through hardware-based attestation. He addresses its fit in modern cloud infrastructure, including Kubernetes, data loss prevention (DLP), API scanning, and more.
The conversation addresses the initial major use cases for confidential computing. High-risk environments including defense, banking, and healthcare are obvious. Not so obvious is securing multi-party data sets in the cloud for machine learning and AI-based applications.
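As a rough illustration of the attestation idea, the sketch below checks that a signed report's code measurement matches an expected value before a workload is trusted. The report fields and the HMAC stand-in for a hardware signature are assumptions for the sake of a self-contained example, not Anjuna's or any vendor's API.

```python
# Hypothetical sketch of attestation-style verification. Real confidential-computing
# platforms use vendor-signed reports rooted in the CPU; HMAC is only a stand-in here
# so the example stays self-contained.
import hashlib
import hmac

def measure(code: bytes) -> str:
    """Measurement = hash of the code/image expected to run in the enclave."""
    return hashlib.sha256(code).hexdigest()

def verify_report(report: dict, expected_measurement: str, verification_key: bytes) -> bool:
    """Trust the workload only if the report is authentic and the measurement matches."""
    payload = (report["measurement"] + report["nonce"]).encode()
    expected_sig = hmac.new(verification_key, payload, hashlib.sha256).hexdigest()
    authentic = hmac.compare_digest(expected_sig, report["signature"])
    untampered = hmac.compare_digest(report["measurement"], expected_measurement)
    return authentic and untampered

# Example: the "hardware" signs a report over the running code's measurement.
key = b"demo-root-of-trust-key"           # stand-in for the hardware root of trust
code = b"approved enclave image v1.0"
report = {"measurement": measure(code), "nonce": "8f2c"}
report["signature"] = hmac.new(key, (report["measurement"] + report["nonce"]).encode(),
                               hashlib.sha256).hexdigest()

print(verify_report(report, measure(code), key))               # True
print(verify_report(report, measure(b"tampered image"), key))  # False
```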
So take a listen to this episode of Making Data Better and learn how hardware-based security can harden the cloud.


Australia's online security community (government, banking, and providers) has made a major, deliberate move. Over the last year, the term "digital identity" has been replaced by "digital ID" in government and industry publications and press releases.
Steve and George unravel this deliberate linguistic shift from the amorphous 'digital identity' to the more concrete and pragmatic 'digital ID' and explain why this nuanced change is more than mere semantics. It's a shift that promises greater clarity in technology, legislation, and personal identification. Tune in and explore a future where proving who you are online is not just more secure, but refreshingly straightforward.
Why does this matter? As Steve and George discuss in this episode of Making Data Better, the change accurately shifts the focus of policy and technical work onto the quality of the data used for identification purposes. It frees everyone from the impossible task of representing our "identities" digitally.
Through our discussion, we celebrate the strides made by Australia in addressing the advancement of digital identity systems—a contrast to the comparatively uncoordinated, market-driven efforts seen in the U.S.
Steve and George conclude with their perspective on how to secure digital IDs using device-assisted presentation. Plaintext presentation is the enemy: it gives hackers endless opportunities to copy that data (via data breaches) and replay it (via fraud). There is a straightforward solution, one we have built before: marry the cryptographic strength of chip-card security to the convenience of smartphone presentation, and we have the opportunity to remove the incentive to breach by making our digital ID data better.
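Here is a minimal sketch of that device-assisted idea, assuming a hypothetical holder device and verifier rather than any standard's wire format: the device signs the credential together with a fresh verifier nonce using a key that never leaves the device, so copied plaintext cannot simply be replayed.

```python
# Hypothetical sketch of device-assisted credential presentation using the
# 'cryptography' package. Names (holder_key, present, verify_presentation) are
# illustrative, not a standard or vendor API.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature
import json, os

# Key pair generated and held on the user's device (e.g. in a secure element).
holder_key = ec.generate_private_key(ec.SECP256R1())
holder_public = holder_key.public_key()

def present(credential: dict, nonce: bytes) -> bytes:
    """Device signs the credential plus a verifier-supplied nonce: no reusable plaintext."""
    message = json.dumps(credential, sort_keys=True).encode() + nonce
    return holder_key.sign(message, ec.ECDSA(hashes.SHA256()))

def verify_presentation(credential: dict, nonce: bytes, signature: bytes) -> bool:
    """Verifier checks the signature over the credential and its own fresh nonce."""
    message = json.dumps(credential, sort_keys=True).encode() + nonce
    try:
        holder_public.verify(signature, message, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

credential = {"claim": "over_18", "value": True}
nonce = os.urandom(16)                       # fresh challenge from the verifier
sig = present(credential, nonce)
print(verify_presentation(credential, nonce, sig))           # True
print(verify_presentation(credential, os.urandom(16), sig))  # False: stale signatures don't replay
```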
There's plenty of work ahead but there's great power in what could be an uncontroversial, technically practical, achievable approach. So take a listen.


FAQ

How many episodes does Making Data Better have?

Making Data Better currently has 16 episodes available.

What topics does Making Data Better cover?

The podcast is about Identity, Data, Risk, Payments, Podcasts, Technology and Cybersecurity.

What is the most popular episode on Making Data Better?

The episode title 'EP3: Data in our digital lives: Many moving parts | 02' is the most popular.

What is the average episode length on Making Data Better?

The average episode length on Making Data Better is 30 minutes.

How often are episodes of Making Data Better released?

Episodes of Making Data Better are typically released every 19 days, 1 hour.

When was the first episode of Making Data Better?

The first episode of Making Data Better was released on May 25, 2023.
