Researchers Submit Patent Application, “Neural Network Encoders And Decoders For Physician Practice Optimization”, for Approval (USPTO 20200034707)

Insurance Daily News

2020 FEB 17 (NewsRx) -- By a News Reporter-Staff News Editor at Insurance Daily News -- From Washington, D.C., NewsRx journalists report that a patent application by the inventors Kivatinos, Daniel (Mountain View, CA); Nusimow, Michael (Mountain View, CA); Borgt, Martin (Sunnyvale, CA); Waychal, Soham (Sunnyvale, CA), filed on June 28, 2019, was made available online on January 30, 2020.

The patent’s assignee is drchrono Inc. (Sunnyvale, California, United States).

News editors obtained the following quote from the background information supplied by the inventors: “When physicians bill for a treatment, such as a procedure or office visit, they record the treatments using standardized codes. These standardized codes are submitted to the payer, such as an insurance company, Medicaid, or Medicare. For example, standardized codes may be used for procedures like examining the patient’s knee or setting a broken arm. Codes are sometimes numerical or alphanumeric. Some code systems include CPT, ICD-9, ICD-10, SNOMED, LOINC, RxNorm, HCPCS, and others. Some code systems are more specialized for diagnosis, while others are more specialized for procedures and payment. Some code systems have as many as tens of thousands of codes.

“Partly due to the overwhelming number of codes, physicians maintain fee schedules, which are lists of the codes commonly used in their practice. This allows physicians to see what codes they commonly use for particular procedures. For example, a procedure as simple as setting a broken ankle may involve multiple codes, rather than just one, and the physician refers to his or her fee schedule to determine what codes to record when submitting the bill to the payer.
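
To make the fee-schedule concept concrete, a fee schedule can be thought of as a simple mapping from a procedure to the billing codes a practice typically submits for it. The sketch below is illustrative only; the procedure names and code strings are placeholders, not values taken from the filing.

```python
# Illustrative only: a fee schedule as a mapping from a procedure description
# to the billing codes the practice typically records for it.
# The code strings are placeholders, not real CPT/ICD values.
fee_schedule = {
    "office visit, established patient": ["CODE-A1"],
    "closed treatment of ankle fracture": ["CODE-B2", "CODE-B3"],  # one procedure, several codes
}

def codes_for(procedure: str) -> list[str]:
    """Look up the codes a physician would record when billing a given procedure."""
    return fee_schedule.get(procedure, [])

print(codes_for("closed treatment of ankle fracture"))  # ['CODE-B2', 'CODE-B3']
```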

“However, a problem in the art is that physicians may not know the best codes to include in their fee schedule and may not know all of the relevant codes to include in their fee schedule. This is an acute problem because of the large number of codes in many coding systems, and the opaqueness of payer policies that result in physicians not knowing which codes will be paid. One result of the problem is that new physicians may have difficulty setting up their own practices because they do not know how to correctly do billing, unless they have spent several years in training at an existing practice. Moreover, many physicians may not be optimizing their billings simply because they do not know the correct codes, leading some physicians to collect more payment for the same work for administrative reasons rather than quality of work.”

As a supplement to the background information on this patent application, NewsRx correspondents also obtained the inventors’ summary information for this patent application: “Some embodiments relate to a machine learning system for predicting billing codes for a physician fee schedule. The machine learning system can be used to suggest additional billing codes that may be added to a physician’s existing fee schedule. The predictions may be made based on the existing billing codes in the physician’s fee schedule and also the physician’s recently used billing codes in their recent billing claims.

“In one embodiment, a billing code encoder is provided that encodes billing codes into a first vector representation. In one embodiment, a fee schedule encoder is provided that encodes fee schedules into a second vector representation. In one embodiment, a billing claims encoder is provided that encodes sets of recent billing claims into a third vector representation. Each vector representation relates similar entities in a vector space by locating them closer together and causes dissimilar entities to be farther apart in the vector space.
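
The idea of locating similar entities "closer together" in a vector space can be illustrated with cosine similarity between embedding vectors. The sketch below is illustrative only; the code names and four-dimensional vectors are invented, not outputs of the patented encoders.

```python
import numpy as np

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors: near 1 for similar entities, lower for dissimilar ones."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical embeddings a billing code encoder might produce.
code_vec = {
    "CODE-KNEE-EXAM": np.array([0.9, 0.1, 0.0, 0.2]),
    "CODE-KNEE-XRAY": np.array([0.8, 0.2, 0.1, 0.1]),  # related code, nearby vector
    "CODE-FLU-SHOT":  np.array([0.0, 0.1, 0.9, 0.7]),  # unrelated code, distant vector
}

print(cosine_similarity(code_vec["CODE-KNEE-EXAM"], code_vec["CODE-KNEE-XRAY"]))  # high
print(cosine_similarity(code_vec["CODE-KNEE-EXAM"], code_vec["CODE-FLU-SHOT"]))   # low
```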

“In one embodiment, a physician fee schedule is provided and a set of recent billing claims of the physician are provided. The individual billing codes in the fee schedule and set of recent billing claims are encoded using the billing code encoder. The fee schedule is then encoded by the fee schedule encoder, and the recent set of billing claims are encoded by the billing claims encoder. The encoded fee schedule and encoded set of recent billing claims may be input to a decoder to predict a billing code to add to the fee schedule.
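
Read as a data flow, that embodiment encodes the fee schedule and the recent claims separately and passes both encodings to a decoder that proposes a code. The PyTorch sketch below is a minimal rendering of that flow under the assumption that the decoder scores every code in the vocabulary; the module choices, sizes, and the concatenation step are assumptions for illustration, not details from the filing.

```python
import torch
import torch.nn as nn

VOCAB, EMB, HID = 10_000, 64, 128       # illustrative sizes

code_embed       = nn.Embedding(VOCAB, EMB)              # billing code encoder (first representation)
schedule_encoder = nn.LSTM(EMB, HID, batch_first=True)   # fee schedule encoder (second representation)
claims_encoder   = nn.LSTM(EMB, HID, batch_first=True)   # billing claims encoder (third representation)
decoder          = nn.Linear(2 * HID, VOCAB)             # scores candidate codes to add

def predict_code(fee_schedule_ids: torch.Tensor, recent_claim_ids: torch.Tensor) -> int:
    """Both inputs are (1, seq_len) tensors of code indices; returns the index of the suggested code."""
    _, (h_sched, _)  = schedule_encoder(code_embed(fee_schedule_ids))
    _, (h_claims, _) = claims_encoder(code_embed(recent_claim_ids))
    logits = decoder(torch.cat([h_sched[-1], h_claims[-1]], dim=-1))
    return int(logits.argmax(dim=-1))

suggested = predict_code(torch.tensor([[12, 57, 903]]), torch.tensor([[57, 4410, 88]]))
```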

“In one embodiment, the encoders and the decoder are machine learning models that are trained using ground-truth training examples.”

The claims supplied by the inventors are:

“1. A computer-implemented method for recommending one or more codes for a physician code schedule, the method comprising: training a first neural network encoder to encode codes into a first vector representation, the first vector representation relating codes that are similar; training a second neural network encoder to encode physician code schedules into a second vector representation, the second vector representation relating physician code schedules that are similar, wherein the physician code schedules comprise one or more codes; training a third neural network encoder to encode claims into a third vector representation, the third vector representation relating claims that are similar, wherein the claims comprise one or more codes; training a neural network decoder to accept as input an encoded physician code schedule and an encoded set of claims to output one or more predicted codes; recommending a code to add to a physician code schedule of a physician by: providing the physician code schedule; providing a set of recent claims of the physician; inputting the physician code schedule to the second neural network encoder to output an encoded physician code schedule; inputting the set of recent claims of the physician to the third neural network encoder to output an encoded set of recent claims of the physician; inputting the encoded physician code schedule and the encoded set of recent claims of the physician into the neural network decoder to output a predicted code.

“2. The computer-implemented method of claim 1, wherein the first neural network encoder creates word embeddings of codes by using a skip-gram model.
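
Claim 2's skip-gram embeddings treat a sequence of codes like a "sentence," training the embedding of a center code to predict the codes that co-occur around it. The sketch below shows the idea in PyTorch; the window size, dimensions, and training loop are assumptions for illustration, not details from the claim.

```python
import torch
import torch.nn as nn

VOCAB, EMB, WINDOW = 10_000, 64, 2       # illustrative sizes

center_embed = nn.Embedding(VOCAB, EMB)  # the learned code embeddings
output_layer = nn.Linear(EMB, VOCAB)     # predicts context codes from a center code
opt = torch.optim.Adam(list(center_embed.parameters()) + list(output_layer.parameters()), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def skipgram_pairs(codes: list[int]):
    """Yield (center, context) pairs from one sequence of billing codes."""
    for i, center in enumerate(codes):
        for j in range(max(0, i - WINDOW), min(len(codes), i + WINDOW + 1)):
            if j != i:
                yield center, codes[j]

def train_on_sequence(codes: list[int]) -> None:
    for center, context in skipgram_pairs(codes):
        logits = output_layer(center_embed(torch.tensor([center])))
        loss = loss_fn(logits, torch.tensor([context]))
        opt.zero_grad(); loss.backward(); opt.step()

train_on_sequence([12, 57, 903, 57])     # codes that co-occur end up with nearby embeddings
```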

“3. The computer-implemented method of claim 1, wherein the second neural network encoder is a long short-term memory (LSTM) neural network, and the second vector representation is created based on the internal state of the second neural network encoder.

“4. The computer-implemented method of claim 1, wherein the second neural network encoder is trained by: removing one or more codes from a ground-truth physician code schedule; inputting the ground-truth physician code schedule with the one or more codes removed into the second neural network encoder; and training the second neural network encoder to output the one or more removed codes.
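
Claim 4 describes a hold-out style of training: remove codes from a real fee schedule, encode what remains, and train the model to recover what was removed (claim 6 applies the same idea to sets of claims). A minimal sketch of the masking step appears below; the helper name and example codes are invented for illustration, and a prediction head over the code vocabulary is assumed.

```python
import random

def mask_codes(schedule_codes: list[str], n_remove: int = 1):
    """Remove n_remove codes from a ground-truth fee schedule: the truncated schedule
    becomes the encoder input, and the removed codes become the training target."""
    removed = random.sample(schedule_codes, n_remove)
    remaining = [c for c in schedule_codes if c not in removed]
    return remaining, removed

inputs, target = mask_codes(["CODE-A1", "CODE-B2", "CODE-B3", "CODE-C7"])
# e.g. inputs = ['CODE-A1', 'CODE-B2', 'CODE-C7'] and target = ['CODE-B3']:
# the second encoder (with a prediction head) is trained so that encoding
# `inputs` lets the model output `target`.
```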

“5. The computer-implemented method of claim 1, wherein the third neural network encoder is a long short-term memory (LSTM) neural network, and the third vector representation is created based on the internal state of the third neural network encoder.

“6. The computer-implemented method of claim 1, wherein the third neural network encoder is trained by: removing one or more codes from a ground-truth set of recent claims; inputting the ground-truth set of recent claims with the one or more codes removed into the third neural network encoder; and training the third neural network encoder to output the one or more removed codes.

“7. The computer-implemented method of claim 1, wherein the neural network decoder is trained by: removing one or more codes from a ground-truth physician code schedule; inputting the ground-truth physician code schedule with the one or more codes removed into the second neural network encoder to output an encoded ground-truth physician code schedule with the one or more codes removed; providing a ground-truth set of recent claims; inputting the ground-truth set of recent claims into the third neural network encoder to output an encoded ground-truth set of recent claims; inputting the encoded ground-truth physician code schedule with the one or more codes removed and the encoded ground-truth set of recent claims in the neural network decoder, and training the neural network decoder to output the one or more removed codes.
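
Claim 7 trains the decoder end to end: encode the truncated fee schedule, encode the recent claims, combine the two encodings, and train the decoder to output the removed codes. The training-step sketch below reuses the module layout from the earlier inference sketch; the concatenation, the cross-entropy objective, and all sizes are assumptions for illustration, not details from the claim.

```python
import torch
import torch.nn as nn

VOCAB, EMB, HID = 10_000, 64, 128
code_embed       = nn.Embedding(VOCAB, EMB)
schedule_encoder = nn.LSTM(EMB, HID, batch_first=True)   # second encoder
claims_encoder   = nn.LSTM(EMB, HID, batch_first=True)   # third encoder
decoder          = nn.Linear(2 * HID, VOCAB)             # decoder over the code vocabulary

params = (list(code_embed.parameters()) + list(schedule_encoder.parameters())
          + list(claims_encoder.parameters()) + list(decoder.parameters()))
opt, loss_fn = torch.optim.Adam(params, lr=1e-3), nn.CrossEntropyLoss()

def training_step(truncated_schedule, recent_claims, removed_code) -> float:
    """truncated_schedule / recent_claims: (1, seq_len) code-index tensors;
    removed_code: (1,) tensor holding the code held out of the schedule."""
    _, (h_sched, _)  = schedule_encoder(code_embed(truncated_schedule))
    _, (h_claims, _) = claims_encoder(code_embed(recent_claims))
    logits = decoder(torch.cat([h_sched[-1], h_claims[-1]], dim=-1))
    loss = loss_fn(logits, removed_code)                 # train decoder to output the removed code
    opt.zero_grad(); loss.backward(); opt.step()
    return float(loss)

training_step(torch.tensor([[12, 57, 4410]]), torch.tensor([[57, 88, 903]]), torch.tensor([903]))
```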

“8. A computer-implemented method for recommending one or more codes for a physician code schedule, the method comprising: training a first encoder to encode codes into a first vector representation, the first vector representation relating codes that are similar; training a second encoder to encode physician code schedules into a second vector representation, the second vector representation relating physician code schedules that are similar, wherein the physician code schedules comprise one or more codes; training a third encoder to encode claims into a third vector representation, the third vector representation relating claims that are similar, wherein the claims comprise one or more codes; training a decoder to accept as input an encoded physician code schedule and an encoded set of claims to output one or more predicted codes; recommending a code to add to a physician code schedule of a physician by: providing the physician code schedule; providing a set of recent claims of the physician; inputting the physician code schedule to the second encoder to obtain an encoded physician code schedule; inputting the set of recent claims of the physician to the third encoder to obtain an encoded set of recent claims of the physician; inputting the encoded physician code schedule and the encoded set of recent claims of the physician into the decoder to output a predicted code.

“9. The computer-implemented method of claim 8, wherein the first encoder creates word embeddings of codes by using a skip-gram model.

“10. The computer-implemented method of claim 8, wherein the second encoder is a long short-term memory (LSTM) neural network, and the second vector representation is created based on the internal state of the second neural network encoder.

“11. The computer-implemented method of claim 8, wherein the second encoder is trained by: removing one or more codes from a ground-truth physician code schedule; inputting the ground-truth physician code schedule with the one or more codes removed into the second encoder; and training the second encoder to output the one or more removed codes.

“12. The computer-implemented method of claim 8, wherein the third encoder is a long short-term memory (LSTM) neural network, and the third vector representation is created based on the internal state of the third encoder.

“13. The computer-implemented method of claim 8, wherein the third encoder is trained by: removing one or more codes from a ground-truth set of recent claims; inputting the ground-truth set of recent claims with the one or more codes removed into the third encoder; and training the third encoder to output the one or more removed codes.

“14. The computer-implemented method of claim 8, wherein the decoder is trained by: removing one or more codes from a ground-truth physician code schedule; inputting the ground-truth physician code schedule with the one or more codes removed into the second encoder to output an encoded ground-truth physician code schedule with the one or more codes removed; providing a ground-truth set of recent claims; inputting the ground-truth set of recent claims into the third encoder to output an encoded ground-truth set of recent claims; inputting the encoded ground-truth physician code schedule with the one or more codes removed and the encoded ground-truth set of recent claims in the decoder, and training the decoder to output the one or more removed codes.

“15. A computer-implemented method for recommending one or more codes for a physician code schedule, the method comprising: training a first encoder to encode codes into a first vector representation; training a second encoder to encode physician code schedules into a second vector representation; training a decoder to accept as input an encoded physician code schedule to output one or more predicted codes; recommending a code to add to a physician code schedule of a physician by: providing the physician code schedule; inputting the physician code schedule to the second encoder to obtain an encoded physician code schedule; inputting the encoded physician code schedule into the decoder to output a predicted code.

“16. The computer-implemented method of claim 15, wherein the first encoder creates word embeddings of codes by using a skip-gram model.

“17. The computer-implemented method of claim 15, wherein the second encoder is a long short-term memory neural network (LSTM), and the second vector representation is created based on the internal state of the second neural network encoder.

“18. The computer-implemented method of claim 15, wherein the second encoder is trained by: removing one or more codes from a ground-truth physician code schedule; inputting the ground-truth physician code schedule with the one or more codes removed into the second encoder; and training the second encoder to output the one or more removed codes.

“19. The computer-implemented method of claim 15, wherein the decoder is trained by: removing one or more codes from a ground-truth physician code schedule; inputting the ground-truth physician code schedule with the one or more codes removed into the second encoder to output an encoded ground-truth physician code schedule with the one or more codes removed; inputting the encoded ground-truth physician code schedule with the one or more codes removed in the decoder, and training the decoder to output the one or more removed codes.

“20. The computer-implemented method of claim 15, wherein the second encoder is a neural network comprising at least one node having a sigmoid activation function and at least one node having a tanh activation function.”
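
The mix of sigmoid and tanh nodes in claim 20 is what a standard LSTM cell looks like internally: sigmoid nodes gate what is written to, kept in, and read from the cell state, while tanh nodes shape the candidate update and the output. The single-step sketch below uses the standard textbook formulation, not text from the filing.

```python
import numpy as np

def sigmoid(x: np.ndarray) -> np.ndarray:
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b stack four blocks (input, forget, output, candidate).
    Sigmoid nodes act as gates; tanh nodes squash the candidate state and the output."""
    hid = h_prev.shape[0]
    z = W @ x + U @ h_prev + b               # shape (4 * hid,)
    i = sigmoid(z[0 * hid:1 * hid])          # input gate      (sigmoid node)
    f = sigmoid(z[1 * hid:2 * hid])          # forget gate     (sigmoid node)
    o = sigmoid(z[2 * hid:3 * hid])          # output gate     (sigmoid node)
    g = np.tanh(z[3 * hid:4 * hid])          # candidate state (tanh node)
    c = f * c_prev + i * g                   # updated cell state
    h = o * np.tanh(c)                       # updated hidden state
    return h, c

hid, emb = 4, 3
rng = np.random.default_rng(0)
h, c = lstm_cell_step(rng.normal(size=emb), np.zeros(hid), np.zeros(hid),
                      rng.normal(size=(4 * hid, emb)), rng.normal(size=(4 * hid, hid)),
                      np.zeros(4 * hid))
```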

For additional information on this patent application, see: Kivatinos, Daniel; Nusimow, Michael; Borgt, Martin; Waychal, Soham. Neural Network Encoders And Decoders For Physician Practice Optimization. Filed June 28, 2019 and posted January 30, 2020. Patent URL: http://appft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PG01&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.html&r=1&f=G&l=50&s1=%2220200034707%22.PGNR.&OS=DN/20200034707&RS=DN/20200034707

(Our reports deliver fact-based news of research and discoveries from around the world.)
