Researchers Submit Patent Application, “Neural Network Encoders And Decoders For Physician Practice Optimization”, for Approval (USPTO 20200034707)
2020 FEB 17 (NewsRx) -- The patent’s assignee is drchrono Inc.
News editors obtained the following quote from the background information supplied by the inventors: “When physicians bill for a treatment, such as a procedure or office visit, they record the treatments using standardized codes. These standardized codes are submitted to the payer, such as an insurance company, Medicaid, or Medicare. For example, standardized codes may be used for procedures like examining the patient’s knee or setting a broken arm. Codes are sometimes numerical or alphanumeric. Some code systems include CPT, ICD-9, ICD-10, SNOMED, LOINC, RxNorm, HCPCS, and others. Some code systems are more specialized for diagnosis, while others are more specialized for procedures and payment. Some code systems have as many as tens of thousands of codes.
“Partly due to the overwhelming number of codes, physicians maintain fee schedules, which are lists of the codes commonly used in their practice. This allows physicians to see what codes they commonly use for particular procedures. For example, a procedure as simple as setting a broken ankle may involve multiple codes, rather than just one, and the physician refers to his or her fee schedule to determine what codes to record when submitting the bill to the payer.
“However, a problem in the art is that physicians may not know the best codes to include in their fee schedule and may not know all of the relevant codes to include in their fee schedule. This is an acute problem because of the large number of codes in many coding systems, and the opaqueness of payer policies that result in physicians not knowing which codes will be paid. One result of the problem is that new physicians may have difficulty setting up their own practices because they do not know how to correctly do billing, unless they have spent several years in training at an existing practice. Moreover, many physicians may not be optimizing their billings simply because they do not know the correct codes, leading some physicians to collect more payment for the same work for administrative reasons rather than quality of work.”
As a supplement to the background information on this patent application, NewsRx correspondents also obtained the inventors’ summary information for this patent application: “Some embodiments relate to a machine learning system for predicting billing codes for a physician fee schedule. The machine learning system can be used to suggest additional billing codes that may be added to a physician’s existing fee schedule. The predictions may be made based on the existing billing codes in the physician’s fee schedule and also the physician’s recently used billing codes in their recent billing claims.
“In one embodiment, a billing code encoder is provided that encodes billing codes into a first vector representation. In one embodiment, a fee schedule encoder is provided that encodes fee schedules into a second vector representation. In one embodiment, a billing claims encoder is provided that encodes sets of recent billing claims into a third vector representation. Each vector representation relates similar entities in a vector space by locating them closer together and causes dissimilar entities to be farther apart in the vector space.
“In one embodiment, a physician fee schedule is provided and a set of recent billing claims of the physician are provided. The individual billing codes in the fee schedule and set of recent billing claims are encoded using the billing code encoder. The fee schedule is then encoded by the fee schedule encoder, and the recent set of billing claims are encoded by the billing claims encoder. The encoded fee schedule and encoded set of recent billing claims may be input to a decoder to predict a billing code to add to the fee schedule.
“In one embodiment, the encoders and the decoder are machine learning models that are trained using ground-truth training examples.”
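To make the architecture concrete, the sketch below shows one plausible way to wire together the three encoders and the decoder described in the summary. It is written in PyTorch with LSTM encoders and invented sizes and names (VOCAB, EMB, HID, predict_code); the application does not publish source code, so this is an illustration of the general idea rather than the inventors' implementation.

    # Minimal sketch of the described pipeline (assumed PyTorch; all sizes invented).
    import torch
    import torch.nn as nn

    VOCAB = 10000  # hypothetical number of distinct billing codes
    EMB = 64       # hypothetical embedding width (first vector representation)
    HID = 128      # hypothetical encoder state width (second/third representations)

    code_embedding = nn.Embedding(VOCAB, EMB)               # billing code encoder
    fee_schedule_enc = nn.LSTM(EMB, HID, batch_first=True)  # fee schedule encoder
    claims_enc = nn.LSTM(EMB, HID, batch_first=True)        # recent-claims encoder
    decoder = nn.Linear(2 * HID, VOCAB)                     # scores candidate codes to add

    def predict_code(fee_schedule_codes, recent_claim_codes):
        """Suggest one billing code to add to the fee schedule."""
        fs = code_embedding(fee_schedule_codes).unsqueeze(0)   # (1, n_codes, EMB)
        cl = code_embedding(recent_claim_codes).unsqueeze(0)
        _, (fs_state, _) = fee_schedule_enc(fs)                # internal state = encoding
        _, (cl_state, _) = claims_enc(cl)
        logits = decoder(torch.cat([fs_state[-1], cl_state[-1]], dim=-1))
        return int(logits.argmax(dim=-1))

    # toy usage with random code indices standing in for a real fee schedule and claims
    print(predict_code(torch.randint(0, VOCAB, (12,)), torch.randint(0, VOCAB, (30,))))

In this reading, the final hidden state of each LSTM serves as the "internal state" vector representation mentioned in the summary, and concatenating the two encodings gives the decoder both the fee-schedule context and the recent-claims context before it scores which code to recommend.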
The claims supplied by the inventors are:
“1. A computer-implemented method for recommending one or more codes for a physician code schedule, the method comprising: training a first neural network encoder to encode codes into a first vector representation, the first vector representation relating codes that are similar; training a second neural network encoder to encode physician code schedules into a second vector representation, the second vector representation relating physician code schedules that are similar, wherein the physician code schedules comprise one or more codes; training a third neural network encoder to encode claims into a third vector representation, the third vector representation relating claims that are similar, wherein the claims comprise one or more codes; training a neural network decoder to accept as input an encoded physician code schedule and an encoded set of claims to output one or more predicted codes; recommending a code to add to a physician code schedule of a physician by: providing the physician code schedule; providing a set of recent claims of the physician; inputting the physician code schedule to the second neural network encoder to output an encoded physician code schedule; inputting the set of recent claims of the physician to the third neural network encoder to output an encoded set of recent claims of the physician; inputting the encoded physician code schedule and the encoded set of recent claims of the physician into the neural network decoder to output a predicted code.
“2. The computer-implemented method of claim 1, wherein the first neural network encoder creates word embeddings of codes by using a skip-gram model.
“3. The computer-implemented method of claim 1, wherein the second neural network encoder is a long short-term memory (LSTM) neural network, and the second vector representation is created based on the internal state of the second neural network encoder.
“4. The computer-implemented method of claim 1, wherein the second neural network encoder is trained by: removing one or more codes from a ground-truth physician code schedule; inputting the ground-truth physician code schedule with the one or more codes removed into the second neural network encoder; and training the second neural network encoder to output the one or more removed codes.
“5. The computer-implemented method of claim 1, wherein the third neural network encoder is a long short-term memory (LSTM) neural network, and the third vector representation is created based on the internal state of the third neural network encoder.
“6. The computer-implemented method of claim 1, wherein the third neural network encoder is trained by: removing one or more codes from a ground-truth set of recent claims; inputting the ground-truth set of recent claims with the one or more codes removed into the third neural network encoder; and training the third neural network encoder to output the one or more removed codes.
“7. The computer-implemented method of claim 1, wherein the neural network decoder is trained by: removing one or more codes from a ground-truth physician code schedule; inputting the ground-truth physician code schedule with the one or more codes removed into the second neural network encoder to output an encoded ground-truth physician code schedule with the one or more codes removed; providing a ground-truth set of recent claims; inputting the ground-truth set of recent claims into the third neural network encoder to output an encoded ground-truth set of recent claims; inputting the encoded ground-truth physician code schedule with the one or more codes removed and the encoded ground-truth set of recent claims in the neural network decoder, and training the neural network decoder to output the one or more removed codes.
“8. A computer-implemented method for recommending one or more codes for a physician code schedule, the method comprising: training a first encoder to encode codes into a first vector representation, the first vector representation relating codes that are similar; training a second encoder to encode physician code schedules into a second vector representation, the second vector representation relating physician code schedules that are similar, wherein the physician code schedules comprise one or more codes; training a third encoder to encode claims into a third vector representation, the third vector representation relating claims that are similar, wherein the claims comprise one or more codes; training a decoder to accept as input an encoded physician code schedule and an encoded set of claims to output one or more predicted codes; recommending a code to add to a physician code schedule of a physician by: providing the physician code schedule; providing a set of recent claims of the physician; inputting the physician code schedule to the second encoder to obtain an encoded physician code schedule; inputting the set of recent claims of the physician to the third encoder to obtain an encoded set of recent claims of the physician; inputting the encoded physician code schedule and the encoded set of recent claims of the physician into the decoder to output a predicted code.
“9. The computer-implemented method of claim 8, wherein the first encoder creates word embeddings of codes by using a skip-gram model.
“10. The computer-implemented method of claim 8, wherein the second encoder is a long short-term memory (LSTM) neural network, and the second vector representation is created based on the internal state of the second neural network encoder.
“11. The computer-implemented method of claim 8, wherein the second encoder is trained by: removing one or more codes from a ground-truth physician code schedule; inputting the ground-truth physician code schedule with the one or more codes removed into the second encoder; and training the second encoder to output the one or more removed codes.
“12. The computer-implemented method of claim 8, wherein the third encoder is a long short-term memory (LSTM) neural network, and the third vector representation is created based on the internal state of the third encoder.
“13. The computer-implemented method of claim 8, wherein the third encoder is trained by: removing one or more codes from a ground-truth set of recent claims; inputting the ground-truth set of recent claims with the one or more codes removed into the third encoder; and training the third encoder to output the one or more removed codes.
“14. The computer-implemented method of claim 8, wherein the decoder is trained by: removing one or more codes from a ground-truth physician code schedule; inputting the ground-truth physician code schedule with the one or more codes removed into the second encoder to output an encoded ground-truth physician code schedule with the one or more codes removed; providing a ground-truth set of recent claims; inputting the ground-truth set of recent claims into the third encoder to output an encoded ground-truth set of recent claims; inputting the encoded ground-truth physician code schedule with the one or more codes removed and the encoded ground-truth set of recent claims in the decoder, and training the decoder to output the one or more removed codes.
“15. A computer-implemented method for recommending one or more codes for a physician code schedule, the method comprising: training a first encoder to encode codes into a first vector representation; training a second encoder to encode physician code schedules into a second vector representation; training a decoder to accept as input an encoded physician code schedule to output one or more predicted codes; recommending a code to add to a physician code schedule of a physician by: providing the physician code schedule; inputting the physician code schedule to the second encoder to obtain an encoded physician code schedule; inputting the encoded physician code schedule into the decoder to output a predicted code.
“16. The computer-implemented method of claim 15, wherein the first encoder creates word embeddings of codes by using a skip-gram model.
“17. The computer-implemented method of claim 15, wherein the second encoder is a long short-term memory neural network (LSTM), and the second vector representation is created based on the internal state of the second neural network encoder.
“18. The computer-implemented method of claim 15, wherein the second encoder is trained by: removing one or more codes from a ground-truth physician code schedule; inputting the ground-truth physician code schedule with the one or more codes removed into the second encoder; and training the second encoder to output the one or more removed codes.
“19. The computer-implemented method of claim 15, wherein the decoder is trained by: removing one or more codes from a ground-truth physician code schedule; inputting the ground-truth physician code schedule with the one or more codes removed into the second encoder to output an encoded ground-truth physician code schedule with the one or more codes removed; inputting the encoded ground-truth physician code schedule with the one or more codes removed in the decoder, and training the decoder to output the one or more removed codes.
“20. The computer-implemented method of claim 15, wherein the second encoder is a neural network comprising at least one node having a sigmoid activation function and at least one node having a tanh activation function.”
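Claims 2, 9 and 16 recite a skip-gram model for embedding the billing codes themselves. The toy sketch below, again in PyTorch with invented claims data and sizes, trains a skip-gram-style objective in which each code is used to predict the other codes that co-occur with it on the same claim; codes that often appear together end up close in the vector space, which is the similarity property claimed for the first vector representation.

    # Skip-gram-style embedding of billing codes (assumed PyTorch; toy data).
    import torch
    import torch.nn as nn

    claims = [[3, 17, 42], [3, 42, 7], [17, 7, 99]]  # hypothetical claims, codes as integers
    VOCAB, EMB = 100, 16

    center_emb = nn.Embedding(VOCAB, EMB)    # the learned code vectors
    context_emb = nn.Embedding(VOCAB, EMB)   # output-side vectors used only during training
    opt = torch.optim.Adam(list(center_emb.parameters()) + list(context_emb.parameters()), lr=0.05)
    loss_fn = nn.CrossEntropyLoss()

    for _ in range(50):
        for claim in claims:
            for center in claim:
                for context in claim:
                    if context == center:
                        continue
                    # predict each co-occurring code from the center code
                    logits = context_emb.weight @ center_emb(torch.tensor(center))
                    loss = loss_fn(logits.unsqueeze(0), torch.tensor([context]))
                    opt.zero_grad(); loss.backward(); opt.step()

    # codes that co-occur frequently (here 3 and 42) end up with similar vectors
    print(torch.cosine_similarity(center_emb(torch.tensor(3)), center_emb(torch.tensor(42)), dim=0))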
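Claims 4, 6, 7 and their counterparts describe training by removing codes from ground-truth data and asking the model to recover them. The sketch below illustrates that idea for the fee-schedule encoder and the decoder; it redefines the same hypothetical PyTorch modules as the pipeline sketch earlier so it stays self-contained, and the single held-out code, the toy data and the training_step helper are assumptions rather than material from the application.

    # Training by hiding a code and learning to predict it (assumed PyTorch; toy data).
    import random
    import torch
    import torch.nn as nn

    VOCAB, EMB, HID = 10000, 64, 128
    code_embedding = nn.Embedding(VOCAB, EMB)
    fee_schedule_enc = nn.LSTM(EMB, HID, batch_first=True)
    claims_enc = nn.LSTM(EMB, HID, batch_first=True)
    decoder = nn.Linear(2 * HID, VOCAB)

    params = (list(code_embedding.parameters()) + list(fee_schedule_enc.parameters())
              + list(claims_enc.parameters()) + list(decoder.parameters()))
    opt = torch.optim.Adam(params, lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    def training_step(fee_schedule, recent_claims):
        """One step: remove a random code from the fee schedule and learn to predict it."""
        held_out = random.randrange(len(fee_schedule))
        target = fee_schedule[held_out]
        remaining = fee_schedule[:held_out] + fee_schedule[held_out + 1:]

        fs = code_embedding(torch.tensor([remaining]))      # (1, n_codes - 1, EMB)
        cl = code_embedding(torch.tensor([recent_claims]))  # (1, n_claim_codes, EMB)
        _, (fs_state, _) = fee_schedule_enc(fs)
        _, (cl_state, _) = claims_enc(cl)
        logits = decoder(torch.cat([fs_state[-1], cl_state[-1]], dim=-1))
        loss = loss_fn(logits, torch.tensor([target]))
        opt.zero_grad(); loss.backward(); opt.step()
        return loss.item()

    # toy ground-truth fee schedule and recent claims, with invented code indices
    print(training_step([12, 99, 431, 8], [12, 431, 55, 8, 99, 7]))

The LSTM encoders here also echo claims 3, 10, 17 and 20: the fixed-length encoding is taken from the LSTM's internal (hidden) state, and a standard LSTM cell already contains nodes with sigmoid and tanh activation functions.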
For additional information on this patent application, see: Kivatinos, Daniel; Nusimow, Michael; Borgt, Martin; Waychal, Soham. Neural Network Encoders And Decoders For Physician Practice Optimization. Filed
(Our reports deliver fact-based news of research and discoveries from around the world.)