e-data
Metadata-Version: 2.4
Name: e-data
Version: 1.3.1
Summary: Python library for managing Spanish energy data from various web providers
Author-email: VMG <vmayorg@outlook.es>
License: GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007

Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

Preamble

The GNU General Public License is a free, copyleft license for
software and other kinds of works.

The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.

When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.

For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.

Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.

For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.

Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.

Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.

The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS

0. Definitions.

"This License" refers to version 3 of the GNU General Public License.

"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

A "covered work" means either the unmodified Program or a work based
on the Program.

To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

1. Source Code.

The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.

All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

3. Protecting Users' Legal Rights From Anti-Circumvention Law.

No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

4. Conveying Verbatim Copies.

You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

5. Conveying Modified Source Versions.

You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.

b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".

c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.

d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.

A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.

You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.

b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.

c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.

d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.

e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.

A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.

"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or

b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or

c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or

d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or

e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or

f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.

All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.

You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

9. Acceptance Not Required for Having Copies.

You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

10. Automatic Licensing of Downstream Recipients.

Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.

A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.

If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.

13. Use with the GNU Affero General Public License.

Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.

14. Revised Versions of this License.

The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.

If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.

Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
| 15. Disclaimer of Warranty. | ||
| THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY | ||
| APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT | ||
| HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY | ||
| OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, | ||
| THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR | ||
| PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM | ||
| IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF | ||
| ALL NECESSARY SERVICING, REPAIR OR CORRECTION. | ||
| 16. Limitation of Liability. | ||
| IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING | ||
| WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS | ||
| THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY | ||
| GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE | ||
| USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF | ||
| DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD | ||
| PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), | ||
| EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF | ||
| SUCH DAMAGES. | ||
| 17. Interpretation of Sections 15 and 16. | ||
| If the disclaimer of warranty and limitation of liability provided | ||
| above cannot be given local legal effect according to their terms, | ||
| reviewing courts shall apply local law that most closely approximates | ||
| an absolute waiver of all civil liability in connection with the | ||
| Program, unless a warranty or assumption of liability accompanies a | ||
| copy of the Program in return for a fee. | ||
| END OF TERMS AND CONDITIONS | ||
| How to Apply These Terms to Your New Programs | ||
| If you develop a new program, and you want it to be of the greatest | ||
| possible use to the public, the best way to achieve this is to make it | ||
| free software which everyone can redistribute and change under these terms. | ||
| To do so, attach the following notices to the program. It is safest | ||
| to attach them to the start of each source file to most effectively | ||
| state the exclusion of warranty; and each file should have at least | ||
| the "copyright" line and a pointer to where the full notice is found. | ||
| <one line to give the program's name and a brief idea of what it does.> | ||
| Copyright (C) <year> <name of author> | ||
| This program is free software: you can redistribute it and/or modify | ||
| it under the terms of the GNU General Public License as published by | ||
| the Free Software Foundation, either version 3 of the License, or | ||
| (at your option) any later version. | ||
| This program is distributed in the hope that it will be useful, | ||
| but WITHOUT ANY WARRANTY; without even the implied warranty of | ||
| MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | ||
| GNU General Public License for more details. | ||
| You should have received a copy of the GNU General Public License | ||
| along with this program. If not, see <https://www.gnu.org/licenses/>. | ||
| Also add information on how to contact you by electronic and paper mail. | ||
| If the program does terminal interaction, make it output a short | ||
| notice like this when it starts in an interactive mode: | ||
| <program> Copyright (C) <year> <name of author> | ||
| This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'. | ||
| This is free software, and you are welcome to redistribute it | ||
| under certain conditions; type `show c' for details. | ||
| The hypothetical commands `show w' and `show c' should show the appropriate | ||
| parts of the General Public License. Of course, your program's commands | ||
| might be different; for a GUI interface, you would use an "about box". | ||
| You should also get your employer (if you work as a programmer) or school, | ||
| if any, to sign a "copyright disclaimer" for the program, if necessary. | ||
| For more information on this, and how to apply and follow the GNU GPL, see | ||
| <https://www.gnu.org/licenses/>. | ||
| The GNU General Public License does not permit incorporating your program | ||
| into proprietary programs. If your program is a subroutine library, you | ||
| may consider it more useful to permit linking proprietary applications with | ||
| the library. If this is what you want to do, use the GNU Lesser General | ||
| Public License instead of this License. But first, please read | ||
| <https://www.gnu.org/licenses/why-not-lgpl.html>. | ||
| Project-URL: Homepage, https://github.com/uvejota/python-edata | ||
| Classifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3) | ||
| Classifier: Programming Language :: Python | ||
| Classifier: Programming Language :: Python :: 3.11 | ||
| Classifier: Programming Language :: Python :: 3.12 | ||
| Classifier: Programming Language :: Python :: Implementation :: CPython | ||
| Classifier: Programming Language :: Python :: Implementation :: PyPy | ||
| Requires-Python: >=3.11.0 | ||
| Description-Content-Type: text/markdown | ||
| License-File: LICENSE | ||
| Requires-Dist: dateparser>=1.1.2 | ||
| Requires-Dist: freezegun>=1.2.1 | ||
| Requires-Dist: holidays>=0.14.2 | ||
| Requires-Dist: pytest>=7.1.2 | ||
| Requires-Dist: python_dateutil>=2.8.2 | ||
| Requires-Dist: requests>=2.28.1 | ||
| Requires-Dist: voluptuous>=0.13.1 | ||
| Requires-Dist: Jinja2>=3.1.2 | ||
| Requires-Dist: diskcache>=5.6.3 | ||
| Requires-Dist: aiohttp>=3.12.15 | ||
| Dynamic: license-file | ||
| [](https://pepy.tech/project/e-data) | ||
| # python-edata | ||
| This package provides tools to download your electricity consumption data (from Datadis.es) and process it afterwards. The main motivation is that knowing our consumption can help us reduce it, and even choose the tariff that best fits our needs. Its billing (€) capabilities are currently limited: it supports PVPC (subject to REData data availability) and fixed per-period pricing. It is the core of the [homeassistant-edata](https://github.com/uvejota/homeassistant-edata) integration. | ||
| _**This tool is not affiliated in any way with the data providers mentioned above; it simply queries the available information and makes it easier to analyse.**_ | ||
| ## Installation | ||
| You can install the latest stable version with: | ||
| ``` bash | ||
| pip install e-data | ||
| ``` | ||
| If you want to try the `dev` version or contribute to its development, clone this repository and install the dependencies manually: | ||
| ``` bash | ||
| pip install -r requirements.txt | ||
| ``` | ||
| ## Structure | ||
| The package consists of three distinct modules: | ||
| * **Connectors** (`connectors` module), defining the query methods for the different providers: Datadis and REData. | ||
| * **Processors** (`processors` module), for processing consumption, maximeter, or cost (billing) data. It currently consists of three processors: `billing`, `consumption` and `maximeter`, plus some utilities located in `utils`. Processors must inherit from the Processor class defined in `base.py`. | ||
| * **Helpers** (`helpers` module), to help use and manage the above. For now it provides a single helper, `EdataHelper`, which lets you fetch `X` days of data (365 by default) and process them automatically. The data is stored in the `data` variable, while the auto-computed attributes are stored in the `attributes` variable. In general, helpers first use the connectors and then process the data, handling several recovery tasks (mainly for Datadis). | ||
| These modules map onto the following package layout: | ||
| ``` | ||
| edata/ | ||
| · __init__.py | ||
| · connectors/ | ||
| · __init__.py | ||
| · datadis.py | ||
| · redata.py | ||
| · processors/ | ||
| · __init__.py | ||
| · base.py | ||
| · billing.py | ||
| · consumption.py | ||
| · maximeter.py | ||
| · utils.py | ||
| · helpers.py | ||
| ``` | ||
| ## Usage example | ||
| We assume you already have Datadis.es credentials. Some clarifications: | ||
| * There is no need to request the public API during registration (the private API, enabled by default, is used) | ||
| * The username is usually the holder's NIF | ||
| * Copy the CUPS from the Datadis website; some retailers append extra characters to the CUPS shown on your bill. | ||
| * The tool accepts an authorized NIF to query a supply belonging to another holder. | ||
| ``` python | ||
| from datetime import datetime | ||
| import json | ||
| # import the data definitions we need | ||
| from edata.definitions import PricingRules | ||
| # import the helper | ||
| from edata.helpers import EdataHelper | ||
| # import the utilities processor | ||
| from edata.processors import utils | ||
| # Prepare pricing rules (optional) | ||
| PRICING_RULES_PVPC = PricingRules( | ||
| p1_kw_year_eur=30.67266, | ||
| p2_kw_year_eur=1.4243591, | ||
| meter_month_eur=0.81, | ||
| market_kw_year_eur=3.113, | ||
| electricity_tax=1.0511300560, | ||
| iva_tax=1.05, | ||
| # the following fields can be filled in for fixed pricing (instead of PVPC) | ||
| p1_kwh_eur=None, | ||
| p2_kwh_eur=None, | ||
| p3_kwh_eur=None, | ||
| ) | ||
| # Instantiate the helper | ||
| # 'datadis_authorized_nif' lets you specify the NIF of the person who authorizes us to query their CUPS. | ||
| # 'data' lets you preload previous data into the helper (the edata.data result of an earlier run) to avoid fetching it again. | ||
| edata = EdataHelper( | ||
| "datadis_user", | ||
| "datadis_password", | ||
| "cups", | ||
| datadis_authorized_nif=None, | ||
| pricing_rules=PRICING_RULES_PVPC,  # if None is passed, no billing is computed | ||
| data=None,  # previous data could be loaded here | ||
| ) | ||
| # Request an update of the whole history (stored in edata.data) | ||
| edata.update(date_from=datetime(1970, 1, 1), date_to=datetime.today()) | ||
| # dump everything retrieved to a file | ||
| with open("backup.json", "w") as file: | ||
| json.dump(utils.serialize_dict(edata.data), file)  # deserialize_dict can be used later to read the backup back | ||
| # Print attributes | ||
| print(edata.attributes) | ||
| ``` |
| dateparser>=1.1.2 | ||
| freezegun>=1.2.1 | ||
| holidays>=0.14.2 | ||
| pytest>=7.1.2 | ||
| python_dateutil>=2.8.2 | ||
| requests>=2.28.1 | ||
| voluptuous>=0.13.1 | ||
| Jinja2>=3.1.2 | ||
| diskcache>=5.6.3 | ||
| aiohttp>=3.12.15 |
| LICENSE | ||
| MANIFEST.in | ||
| README.md | ||
| pyproject.toml | ||
| edata/connectors/__init__.py | ||
| edata/connectors/datadis.py | ||
| edata/connectors/redata.py | ||
| edata/e_data.egg-info/PKG-INFO | ||
| edata/e_data.egg-info/SOURCES.txt | ||
| edata/e_data.egg-info/dependency_links.txt | ||
| edata/e_data.egg-info/requires.txt | ||
| edata/e_data.egg-info/top_level.txt | ||
| edata/processors/__init__.py | ||
| edata/processors/base.py | ||
| edata/processors/billing.py | ||
| edata/processors/consumption.py | ||
| edata/processors/maximeter.py | ||
| edata/processors/utils.py | ||
| edata/tests/__init__.py | ||
| edata/tests/test_datadis_connector.py | ||
| edata/tests/test_helpers.py | ||
| edata/tests/test_processors.py | ||
| edata/tests/test_redata_connector.py |
| connectors | ||
| processors | ||
| tests |
| """Base definitions for processors.""" | ||
| from abc import ABC, abstractmethod | ||
| from copy import deepcopy | ||
| from typing import Any | ||
| class Processor(ABC): | ||
| """A base class for data processors.""" | ||
| _LABEL = "Processor" | ||
| def __init__(self, input_data: Any, auto: bool = True) -> None: | ||
| """Init method.""" | ||
| self._input = deepcopy(input_data) | ||
| self._output = None | ||
| if auto: | ||
| self.do_process() | ||
| @abstractmethod | ||
| def do_process(self): | ||
| """Process method.""" | ||
| @property | ||
| def output(self): | ||
| """An output property.""" | ||
| return deepcopy(self._output) |
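For illustration, here is a minimal, self-contained sketch of how a concrete processor would plug into this base class. The base class is reproduced from `base.py` above; `DoublingProcessor` is a hypothetical example, not part of edata:

```python
from abc import ABC, abstractmethod
from copy import deepcopy
from typing import Any


class Processor(ABC):
    """A base class for data processors (as defined in base.py)."""

    def __init__(self, input_data: Any, auto: bool = True) -> None:
        self._input = deepcopy(input_data)  # defensive copy of the input
        self._output = None
        if auto:
            self.do_process()  # process eagerly by default

    @abstractmethod
    def do_process(self):
        """Populate self._output from self._input."""

    @property
    def output(self):
        """Return a defensive copy of the processed output."""
        return deepcopy(self._output)


class DoublingProcessor(Processor):
    """Hypothetical processor that doubles every input value."""

    def do_process(self):
        self._output = [2 * x for x in self._input]


proc = DoublingProcessor([1, 2, 3])
print(proc.output)  # [2, 4, 6]
```

Because `auto=True` by default, subclasses only need to implement `do_process`; the result is then available through the read-only `output` property.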
| """Billing data processors.""" | ||
| import contextlib | ||
| from datetime import datetime, timedelta | ||
| import logging | ||
| from typing import Optional, TypedDict | ||
| from jinja2 import Environment | ||
| import voluptuous | ||
| from ..definitions import ( | ||
| ConsumptionData, | ||
| ConsumptionSchema, | ||
| ContractData, | ||
| ContractSchema, | ||
| PricingAggData, | ||
| PricingData, | ||
| PricingRules, | ||
| PricingRulesSchema, | ||
| PricingSchema, | ||
| ) | ||
| from ..processors import utils | ||
| from ..processors.base import Processor | ||
| _LOGGER = logging.getLogger(__name__) | ||
| class BillingOutput(TypedDict): | ||
| """A dict holding BillingProcessor output property.""" | ||
| hourly: list[PricingAggData] | ||
| daily: list[PricingAggData] | ||
| monthly: list[PricingAggData] | ||
| class BillingInput(TypedDict): | ||
| """A dict holding BillingProcessor input data.""" | ||
| contracts: list[ContractData] | ||
| consumptions: list[ConsumptionData] | ||
| prices: Optional[list[PricingData]] | ||
| rules: PricingRules | ||
| class BillingProcessor(Processor): | ||
| """A billing processor for edata.""" | ||
| def do_process(self): | ||
| """Process billing and get hourly/daily/monthly metrics.""" | ||
| self._output = BillingOutput(hourly=[], daily=[], monthly=[]) | ||
| _schema = voluptuous.Schema( | ||
| { | ||
| voluptuous.Required("contracts"): [ContractSchema], | ||
| voluptuous.Required("consumptions"): [ConsumptionSchema], | ||
| voluptuous.Optional("prices", default=None): voluptuous.Union( | ||
| [voluptuous.Union(PricingSchema)], None | ||
| ), | ||
| voluptuous.Required("rules"): PricingRulesSchema, | ||
| } | ||
| ) | ||
| self._input = _schema(self._input) | ||
| self._cycle_offset = self._input["rules"]["cycle_start_day"] - 1 | ||
| # joint data by datetime | ||
| _data = { | ||
| x["datetime"]: { | ||
| "datetime": x["datetime"], | ||
| "kwh": x["value_kWh"], | ||
| "surplus_kwh": x["surplus_kWh"] if x["surplus_kWh"] is not None else 0, | ||
| } | ||
| for x in self._input["consumptions"] | ||
| } | ||
| for contract in self._input["contracts"]: | ||
| start = contract["date_start"] | ||
| end = contract["date_end"] | ||
| finish = False | ||
| while not finish: | ||
| if start in _data: | ||
| _data[start]["p1_kw"] = contract["power_p1"] | ||
| _data[start]["p2_kw"] = contract["power_p2"] | ||
| start = start + timedelta(hours=1) | ||
| finish = not (end > start) | ||
| if self._input["prices"]: | ||
| for x in self._input["prices"]: | ||
| start = x["datetime"] | ||
| if start in _data: | ||
| _data[start]["kwh_eur"] = x["value_eur_kWh"] | ||
| env = Environment() | ||
| energy_expr = env.compile_expression( | ||
| f'({self._input["rules"]["energy_formula"]})|float' | ||
| ) | ||
| power_expr = env.compile_expression( | ||
| f'({self._input["rules"]["power_formula"]})|float' | ||
| ) | ||
| others_expr = env.compile_expression( | ||
| f'({self._input["rules"]["others_formula"]})|float' | ||
| ) | ||
| surplus_expr = env.compile_expression( | ||
| f'({self._input["rules"]["surplus_formula"]})|float' | ||
| ) | ||
| main_expr = env.compile_expression( | ||
| f'({self._input["rules"]["main_formula"]})|float' | ||
| ) | ||
| _data = sorted([_data[x] for x in _data], key=lambda x: x["datetime"]) | ||
| hourly = [] | ||
| for x in _data: | ||
| x.update(self._input["rules"]) | ||
| tariff = utils.get_pvpc_tariff(x["datetime"]) | ||
| if "kwh_eur" not in x: | ||
| if tariff == "p1": | ||
| x["kwh_eur"] = x["p1_kwh_eur"] | ||
| elif tariff == "p2": | ||
| x["kwh_eur"] = x["p2_kwh_eur"] | ||
| elif tariff == "p3": | ||
| x["kwh_eur"] = x["p3_kwh_eur"] | ||
| if x["kwh_eur"] is None: | ||
| continue | ||
| if tariff == "p1": | ||
| x["surplus_kwh_eur"] = x["surplus_p1_kwh_eur"] | ||
| elif tariff == "p2": | ||
| x["surplus_kwh_eur"] = x["surplus_p2_kwh_eur"] | ||
| elif tariff == "p3": | ||
| x["surplus_kwh_eur"] = x["surplus_p3_kwh_eur"] | ||
| _energy_term = 0 | ||
| _power_term = 0 | ||
| _others_term = 0 | ||
| _surplus_term = 0 | ||
| with contextlib.suppress(Exception): | ||
| _energy_term = round(energy_expr(**x), 6) | ||
| _power_term = round(power_expr(**x), 6) | ||
| _others_term = round(others_expr(**x), 6) | ||
| _surplus_term = round(surplus_expr(**x), 6) | ||
| new_item = PricingAggData( | ||
| datetime=x["datetime"], | ||
| energy_term=_energy_term, | ||
| power_term=_power_term, | ||
| others_term=_others_term, | ||
| surplus_term=_surplus_term, | ||
| value_eur=0, | ||
| delta_h=1, | ||
| ) | ||
| hourly.append(new_item) | ||
| self._output["hourly"] = hourly | ||
| last_day_dt = None | ||
| last_month_dt = None | ||
| for hour in hourly: | ||
| curr_hour_dt: datetime = hour["datetime"] | ||
| curr_day_dt = curr_hour_dt.replace(hour=0, minute=0, second=0) | ||
| curr_month_dt = (curr_day_dt - timedelta(days=self._cycle_offset)).replace( | ||
| day=1 | ||
| ) | ||
| if last_day_dt is None or curr_day_dt != last_day_dt: | ||
| self._output["daily"].append( | ||
| PricingAggData( | ||
| datetime=curr_day_dt, | ||
| energy_term=hour["energy_term"], | ||
| power_term=hour["power_term"], | ||
| others_term=hour["others_term"], | ||
| surplus_term=hour["surplus_term"], | ||
| value_eur=hour["value_eur"], | ||
| delta_h=hour["delta_h"], | ||
| ) | ||
| ) | ||
| else: | ||
| self._output["daily"][-1]["energy_term"] += hour["energy_term"] | ||
| self._output["daily"][-1]["power_term"] += hour["power_term"] | ||
| self._output["daily"][-1]["others_term"] += hour["others_term"] | ||
| self._output["daily"][-1]["surplus_term"] += hour["surplus_term"] | ||
| self._output["daily"][-1]["delta_h"] += hour["delta_h"] | ||
| self._output["daily"][-1]["value_eur"] += hour["value_eur"] | ||
| if last_month_dt is None or curr_month_dt != last_month_dt: | ||
| self._output["monthly"].append( | ||
| PricingAggData( | ||
| datetime=curr_month_dt, | ||
| energy_term=hour["energy_term"], | ||
| power_term=hour["power_term"], | ||
| others_term=hour["others_term"], | ||
| surplus_term=hour["surplus_term"], | ||
| value_eur=hour["value_eur"], | ||
| delta_h=hour["delta_h"], | ||
| ) | ||
| ) | ||
| else: | ||
| self._output["monthly"][-1]["energy_term"] += hour["energy_term"] | ||
| self._output["monthly"][-1]["power_term"] += hour["power_term"] | ||
| self._output["monthly"][-1]["others_term"] += hour["others_term"] | ||
| self._output["monthly"][-1]["surplus_term"] += hour["surplus_term"] | ||
| self._output["monthly"][-1]["value_eur"] += hour["value_eur"] | ||
| self._output["monthly"][-1]["delta_h"] += hour["delta_h"] | ||
| last_day_dt = curr_day_dt | ||
| last_month_dt = curr_month_dt | ||
| for item in self._output: | ||
| for cost in self._output[item]: | ||
| cost["value_eur"] = round(main_expr(**cost, **self._input["rules"]), 6) | ||
| cost["energy_term"] = round(cost["energy_term"], 6) | ||
| cost["power_term"] = round(cost["power_term"], 6) | ||
| cost["others_term"] = round(cost["others_term"], 6) | ||
| cost["surplus_term"] = round(cost["surplus_term"], 6) |
| """Consumption data processors.""" | ||
| import logging | ||
| from collections.abc import Iterable | ||
| from typing import TypedDict | ||
| from datetime import datetime, timedelta | ||
| import voluptuous | ||
| from ..definitions import ConsumptionAggData, ConsumptionSchema | ||
| from . import utils | ||
| from .base import Processor | ||
| _LOGGER = logging.getLogger(__name__) | ||
| class ConsumptionOutput(TypedDict): | ||
| """A dict holding ConsumptionProcessor output property.""" | ||
| daily: Iterable[ConsumptionAggData] | ||
| monthly: Iterable[ConsumptionAggData] | ||
| class ConsumptionProcessor(Processor): | ||
| """A consumptions processor.""" | ||
| def do_process(self): | ||
| """Calculate daily and monthly consumption stats.""" | ||
| self._output = ConsumptionOutput(daily=[], monthly=[]) | ||
| last_day_dt = None | ||
| last_month_dt = None | ||
| _schema = voluptuous.Schema( | ||
| { | ||
| voluptuous.Required("consumptions"): [ConsumptionSchema], | ||
| voluptuous.Optional("cycle_start_day", default=1): voluptuous.Range( | ||
| 1, 30 | ||
| ), | ||
| } | ||
| ) | ||
| self._input = _schema(self._input) | ||
| self._cycle_offset = self._input["cycle_start_day"] - 1 | ||
| for consumption in self._input["consumptions"]: | ||
| curr_hour_dt: datetime = consumption["datetime"] | ||
| curr_day_dt = curr_hour_dt.replace(hour=0, minute=0, second=0) | ||
| curr_month_dt = (curr_day_dt - timedelta(days=self._cycle_offset)).replace( | ||
| day=1 | ||
| ) | ||
| tariff = utils.get_pvpc_tariff(curr_hour_dt) | ||
| kwh = consumption["value_kWh"] | ||
| surplus_kwh = consumption["surplus_kWh"] | ||
| delta_h = consumption["delta_h"] | ||
| kwh_by_tariff = [0, 0, 0] | ||
| surplus_kwh_by_tariff = [0, 0, 0] | ||
| match tariff: | ||
| case "p1": | ||
| kwh_by_tariff[0] = kwh | ||
| surplus_kwh_by_tariff[0] = surplus_kwh | ||
| case "p2": | ||
| kwh_by_tariff[1] = kwh | ||
| surplus_kwh_by_tariff[1] = surplus_kwh | ||
| case "p3": | ||
| kwh_by_tariff[2] = kwh | ||
| surplus_kwh_by_tariff[2] = surplus_kwh | ||
| if last_day_dt is None or curr_day_dt != last_day_dt: | ||
| self._output["daily"].append( | ||
| ConsumptionAggData( | ||
| datetime=curr_day_dt, | ||
| value_kWh=kwh, | ||
| delta_h=delta_h, | ||
| value_p1_kWh=kwh_by_tariff[0], | ||
| value_p2_kWh=kwh_by_tariff[1], | ||
| value_p3_kWh=kwh_by_tariff[2], | ||
| surplus_kWh=surplus_kwh, | ||
| surplus_p1_kWh=surplus_kwh_by_tariff[0], | ||
| surplus_p2_kWh=surplus_kwh_by_tariff[1], | ||
| surplus_p3_kWh=surplus_kwh_by_tariff[2], | ||
| ) | ||
| ) | ||
| else: | ||
| self._output["daily"][-1]["value_kWh"] += kwh | ||
| self._output["daily"][-1]["value_p1_kWh"] += kwh_by_tariff[0] | ||
| self._output["daily"][-1]["value_p2_kWh"] += kwh_by_tariff[1] | ||
| self._output["daily"][-1]["value_p3_kWh"] += kwh_by_tariff[2] | ||
| self._output["daily"][-1]["surplus_kWh"] += surplus_kwh | ||
| self._output["daily"][-1]["surplus_p1_kWh"] += surplus_kwh_by_tariff[0] | ||
| self._output["daily"][-1]["surplus_p2_kWh"] += surplus_kwh_by_tariff[1] | ||
| self._output["daily"][-1]["surplus_p3_kWh"] += surplus_kwh_by_tariff[2] | ||
| self._output["daily"][-1]["delta_h"] += delta_h | ||
| if last_month_dt is None or curr_month_dt != last_month_dt: | ||
| self._output["monthly"].append( | ||
| ConsumptionAggData( | ||
| datetime=curr_month_dt, | ||
| value_kWh=kwh, | ||
| delta_h=delta_h, | ||
| value_p1_kWh=kwh_by_tariff[0], | ||
| value_p2_kWh=kwh_by_tariff[1], | ||
| value_p3_kWh=kwh_by_tariff[2], | ||
| surplus_kWh=surplus_kwh, | ||
| surplus_p1_kWh=surplus_kwh_by_tariff[0], | ||
| surplus_p2_kWh=surplus_kwh_by_tariff[1], | ||
| surplus_p3_kWh=surplus_kwh_by_tariff[2], | ||
| ) | ||
| ) | ||
| else: | ||
| self._output["monthly"][-1]["value_kWh"] += kwh | ||
| self._output["monthly"][-1]["value_p1_kWh"] += kwh_by_tariff[0] | ||
| self._output["monthly"][-1]["value_p2_kWh"] += kwh_by_tariff[1] | ||
| self._output["monthly"][-1]["value_p3_kWh"] += kwh_by_tariff[2] | ||
| self._output["monthly"][-1]["surplus_kWh"] += surplus_kwh | ||
| self._output["monthly"][-1]["surplus_p1_kWh"] += surplus_kwh_by_tariff[ | ||
| 0 | ||
| ] | ||
| self._output["monthly"][-1]["surplus_p2_kWh"] += surplus_kwh_by_tariff[ | ||
| 1 | ||
| ] | ||
| self._output["monthly"][-1]["surplus_p3_kWh"] += surplus_kwh_by_tariff[ | ||
| 2 | ||
| ] | ||
| self._output["monthly"][-1]["delta_h"] += delta_h | ||
| last_day_dt = curr_day_dt | ||
| last_month_dt = curr_month_dt | ||
| # Round to two decimals | ||
| for item in self._output: | ||
| for cons in self._output[item]: | ||
| for key in cons: | ||
| if isinstance(cons[key], float): | ||
| cons[key] = round(cons[key], 2) |
| """Maximeter data processors.""" | ||
| import logging | ||
| from datetime import datetime | ||
| from typing import TypedDict | ||
| from dateparser import parse | ||
| import voluptuous | ||
| from edata.definitions import MaxPowerSchema | ||
| from edata.processors import utils | ||
| from .base import Processor | ||
| _LOGGER = logging.getLogger(__name__) | ||
| class MaximeterStats(TypedDict): | ||
| """A dict holding MaximeterProcessor stats.""" | ||
| value_max_kW: float | ||
| date_max: datetime | ||
| value_mean_kW: float | ||
| value_tile90_kW: float | ||
| class MaximeterOutput(TypedDict): | ||
| """A dict holding MaximeterProcessor output property.""" | ||
| stats: MaximeterStats | ||
| class MaximeterProcessor(Processor): | ||
| """A processor for Maximeter data.""" | ||
| def do_process(self): | ||
| """Calculate maximeter stats.""" | ||
| self._output = MaximeterOutput(stats={}) | ||
| _schema = voluptuous.Schema([MaxPowerSchema]) | ||
| self._input = _schema(self._input) | ||
| _values = [x["value_kW"] for x in self._input] | ||
| _max_kW = max(_values) | ||
| _dt_max_kW = parse(str(self._input[_values.index(_max_kW)]["datetime"])) | ||
| _mean_kW = sum(_values) / len(_values) | ||
| _tile90_kW = utils.percentile(_values, 0.9) | ||
| self._output["stats"] = MaximeterStats( | ||
| value_max_kW=round(_max_kW, 2), | ||
| date_max=_dt_max_kW, | ||
| value_mean_kW=round(_mean_kW, 2), | ||
| value_tile90_kW=round(_tile90_kW, 2), | ||
| ) |
| """Generic utilities for processing data.""" | ||
| import json | ||
| import logging | ||
| from copy import deepcopy | ||
| from datetime import date, datetime, timedelta | ||
| from json import JSONEncoder | ||
| import holidays | ||
| import math | ||
| import functools | ||
| import contextlib | ||
| _LOGGER = logging.getLogger(__name__) | ||
| logging.basicConfig(level=logging.INFO) | ||
| HOURS_P1 = [10, 11, 12, 13, 18, 19, 20, 21] | ||
| HOURS_P2 = [8, 9, 14, 15, 16, 17, 22, 23] | ||
| WEEKDAYS_P3 = [5, 6] | ||
| def is_empty(lst): | ||
| """Check if a list is empty.""" | ||
| return len(lst) == 0 | ||
| def extract_dt_ranges(lst, dt_from, dt_to, gap_interval=timedelta(hours=1)): | ||
| """Filter a list of dicts between two datetimes.""" | ||
| new_lst = [] | ||
| missing = [] | ||
| oldest_dt = None | ||
| newest_dt = None | ||
| last_dt = None | ||
| if len(lst) > 0: | ||
| sorted_lst = sorted(lst, key=lambda i: i["datetime"]) | ||
| last_dt = dt_from | ||
| for i in sorted_lst: | ||
| if dt_from <= i["datetime"] <= dt_to: | ||
| if (i["datetime"] - last_dt) > gap_interval: | ||
| missing.append({"from": last_dt, "to": i["datetime"]}) | ||
| if i.get("value_kWh", 1) > 0: | ||
| if oldest_dt is None or i["datetime"] < oldest_dt: | ||
| oldest_dt = i["datetime"] | ||
| if newest_dt is None or i["datetime"] > newest_dt: | ||
| newest_dt = i["datetime"] | ||
| if i["datetime"] != last_dt: # remove duplicates | ||
| new_lst.append(i) | ||
| last_dt = i["datetime"] | ||
| if dt_to > last_dt: | ||
| missing.append({"from": last_dt, "to": dt_to}) | ||
| _LOGGER.debug("found data from %s to %s", oldest_dt, newest_dt) | ||
| else: | ||
| missing.append({"from": dt_from, "to": dt_to}) | ||
| return new_lst, missing | ||
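A slightly simplified, self-contained copy of `extract_dt_ranges` (the oldest/newest bookkeeping and logging are omitted) shows how gaps are reported; the sample readings below are illustrative only:

```python
from datetime import datetime, timedelta


def extract_dt_ranges(lst, dt_from, dt_to, gap_interval=timedelta(hours=1)):
    """Filter dicts between two datetimes, reporting missing ranges."""
    if not lst:
        return [], [{"from": dt_from, "to": dt_to}]
    new_lst, missing = [], []
    last_dt = dt_from
    for i in sorted(lst, key=lambda i: i["datetime"]):
        if dt_from <= i["datetime"] <= dt_to:
            if (i["datetime"] - last_dt) > gap_interval:
                # hole wider than one sampling interval -> report it
                missing.append({"from": last_dt, "to": i["datetime"]})
            if i["datetime"] != last_dt:  # remove duplicates
                new_lst.append(i)
                last_dt = i["datetime"]
    if dt_to > last_dt:
        missing.append({"from": last_dt, "to": dt_to})
    return new_lst, missing


data = [
    {"datetime": datetime(2022, 10, 22, 1), "value_kWh": 0.2},
    {"datetime": datetime(2022, 10, 22, 2), "value_kWh": 0.1},
    {"datetime": datetime(2022, 10, 22, 5), "value_kWh": 0.3},  # 2h hole before
]
kept, missing = extract_dt_ranges(
    data, datetime(2022, 10, 22, 0), datetime(2022, 10, 22, 6)
)
print(len(kept), missing)  # 3 items kept; gaps 02:00-05:00 and 05:00-06:00
```

The `missing` ranges are what the helpers later use to decide which intervals still need to be fetched from the provider.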
| def extend_by_key(old_lst, new_lst, key): | ||
| """Extend a list of dicts by key.""" | ||
| lst = deepcopy(old_lst) | ||
| temp_list = [] | ||
| for new_element in new_lst: | ||
| for old_element in lst: | ||
| if new_element[key] == old_element[key]: | ||
| for i in old_element: | ||
| old_element[i] = new_element[i] | ||
| break | ||
| else: | ||
| temp_list.append(new_element) | ||
| lst.extend(temp_list) | ||
| return lst | ||
| def extend_and_filter(old_lst, new_lst, key, dt_from, dt_to): | ||
| """Extend a list of dicts by key, then filter them between two datetimes.""" | ||
| data = extend_by_key(old_lst, new_lst, key) | ||
| data, _ = extract_dt_ranges( | ||
| data, | ||
| dt_from, | ||
| dt_to, | ||
| gap_interval=timedelta(days=365), # trick | ||
| ) | ||
| return data | ||
| def get_by_key(lst, key, value): | ||
| """Obtain an element of a list of dicts by key=value.""" | ||
| for i in lst: | ||
| if i[key] == value: | ||
| return i | ||
| return None | ||
| def get_pvpc_tariff(a_datetime): | ||
| """Evaluate the PVPC tariff period for a given datetime.""" | ||
| hdays = holidays.country_holidays("ES") | ||
| hour = a_datetime.hour | ||
| weekday = a_datetime.weekday() | ||
| if weekday in WEEKDAYS_P3 or a_datetime.date() in hdays: | ||
| return "p3" | ||
| elif hour in HOURS_P1: | ||
| return "p1" | ||
| elif hour in HOURS_P2: | ||
| return "p2" | ||
| else: | ||
| return "p3" | ||
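The period rules above can be exercised with a self-contained sketch in which the holiday lookup is stubbed by a flag (the real implementation consults the `holidays` package):

```python
from datetime import datetime

HOURS_P1 = [10, 11, 12, 13, 18, 19, 20, 21]  # peak hours
HOURS_P2 = [8, 9, 14, 15, 16, 17, 22, 23]    # flat hours
WEEKDAYS_P3 = [5, 6]                          # Saturday and Sunday


def get_pvpc_tariff(a_datetime, is_holiday=False):
    """Return the PVPC period; `is_holiday` stubs the real holiday lookup."""
    if a_datetime.weekday() in WEEKDAYS_P3 or is_holiday:
        return "p3"
    if a_datetime.hour in HOURS_P1:
        return "p1"
    if a_datetime.hour in HOURS_P2:
        return "p2"
    return "p3"  # remaining night-time valley hours


print(get_pvpc_tariff(datetime(2022, 10, 24, 11)))  # Monday 11:00 -> p1
print(get_pvpc_tariff(datetime(2022, 10, 22, 11)))  # Saturday -> p3
```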
| def serialize_dict(data: dict) -> dict: | ||
| """Serialize dicts as json.""" | ||
| class DateTimeEncoder(JSONEncoder): | ||
| """Replace datetime objects with ISO strings.""" | ||
| def default(self, o): | ||
| if isinstance(o, (date, datetime)): | ||
| return o.isoformat() | ||
| return json.loads(json.dumps(data, cls=DateTimeEncoder)) | ||
| def deserialize_dict(serialized_dict: dict) -> dict: | ||
| """Deserialize a JSON dict, converting ISO strings back into datetime objects.""" | ||
| def datetime_parser(json_dict): | ||
| """Parse JSON while converting ISO strings into datetime objects.""" | ||
| for key, value in json_dict.items(): | ||
| if "date" in key: | ||
| with contextlib.suppress(Exception): | ||
| json_dict[key] = datetime.fromisoformat(value) | ||
| return json_dict | ||
| return json.loads(json.dumps(serialized_dict), object_hook=datetime_parser) | ||
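A self-contained round trip through the two helpers above (reproduced here with one small addition: the encoder falls back to `JSONEncoder.default` for types it does not handle):

```python
import contextlib
import json
from datetime import date, datetime
from json import JSONEncoder


def serialize_dict(data: dict) -> dict:
    """Render datetimes as ISO strings so the dict is JSON-safe."""

    class DateTimeEncoder(JSONEncoder):
        def default(self, o):
            if isinstance(o, (date, datetime)):
                return o.isoformat()
            return super().default(o)  # raise for unsupported types

    return json.loads(json.dumps(data, cls=DateTimeEncoder))


def deserialize_dict(serialized_dict: dict) -> dict:
    """Turn ISO strings under keys containing 'date' back into datetimes."""

    def datetime_parser(json_dict):
        for key, value in json_dict.items():
            if "date" in key:  # matches 'datetime', 'date_start', ...
                with contextlib.suppress(Exception):
                    json_dict[key] = datetime.fromisoformat(value)
        return json_dict

    return json.loads(json.dumps(serialized_dict), object_hook=datetime_parser)


original = {"consumptions": [{"datetime": datetime(2022, 10, 22, 1), "value_kWh": 0.2}]}
restored = deserialize_dict(serialize_dict(original))
assert restored == original  # datetimes survive the round trip
```

This is the same pair used in the README's backup example: `serialize_dict` before `json.dump`, `deserialize_dict` after `json.load`.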
| def percentile(N, percent, key=lambda x: x): | ||
| """Find the percentile of a list of values.""" | ||
| if not N: | ||
| return None | ||
| k = (len(N) - 1) * percent | ||
| f = math.floor(k) | ||
| c = math.ceil(k) | ||
| if f == c: | ||
| return key(N[int(k)]) | ||
| d0 = key(N[int(f)]) * (c - k) | ||
| d1 = key(N[int(c)]) * (k - f) | ||
| return d0 + d1 |
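A quick, self-contained check of `percentile` (note that the linear interpolation assumes its input is already sorted):

```python
import math


def percentile(N, percent, key=lambda x: x):
    """Linear-interpolation percentile of a sorted list of values."""
    if not N:
        return None
    k = (len(N) - 1) * percent  # fractional rank
    f = math.floor(k)
    c = math.ceil(k)
    if f == c:
        return key(N[int(k)])  # exact rank, no interpolation needed
    d0 = key(N[int(f)]) * (c - k)
    d1 = key(N[int(c)]) * (k - f)
    return d0 + d1


print(percentile([1, 2, 3, 4], 0.5))  # 2.5 (interpolated median)
print(percentile([7], 0.9))           # 7 (single sample)
```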
| """Tests for DatadisConnector (offline).""" | ||
| import datetime | ||
| from unittest.mock import patch, AsyncMock, MagicMock | ||
| from ..connectors.datadis import DatadisConnector | ||
| MOCK_USERNAME = "USERNAME" | ||
| MOCK_PASSWORD = "PASSWORD" | ||
| SUPPLIES_RESPONSE = [ | ||
| { | ||
| "cups": "ESXXXXXXXXXXXXXXXXTEST", | ||
| "validDateFrom": "2022/03/09", | ||
| "validDateTo": "2022/10/28", | ||
| "address": "-", | ||
| "postalCode": "-", | ||
| "province": "-", | ||
| "municipality": "-", | ||
| "distributor": "-", | ||
| "pointType": 5, | ||
| "distributorCode": "2", | ||
| } | ||
| ] | ||
| CONTRACTS_RESPONSE = [ | ||
| { | ||
| "startDate": "2022/03/09", | ||
| "endDate": "2022/10/28", | ||
| "marketer": "MARKETER", | ||
| "distributorCode": "2", | ||
| "contractedPowerkW": [4.4, 4.4], | ||
| } | ||
| ] | ||
| CONSUMPTIONS_RESPONSE = [ | ||
| { | ||
| "date": "2022/10/22", | ||
| "time": "01:00", | ||
| "consumptionKWh": 0.203, | ||
| "surplusEnergyKWh": 0, | ||
| "obtainMethod": "Real", | ||
| }, | ||
| { | ||
| "date": "2022/10/22", | ||
| "time": "02:00", | ||
| "consumptionKWh": 0.163, | ||
| "surplusEnergyKWh": 0, | ||
| "obtainMethod": "Real", | ||
| }, | ||
| ] | ||
| MAXIMETER_RESPONSE = [ | ||
| { | ||
| "date": "2022/03/10", | ||
| "time": "14:15", | ||
| "maxPower": 2.436, | ||
| }, | ||
| { | ||
| "date": "2022/03/14", | ||
| "time": "13:15", | ||
| "maxPower": 3.008, | ||
| }, | ||
| { | ||
| "date": "2022/03/27", | ||
| "time": "10:30", | ||
| "maxPower": 3.288, | ||
| }, | ||
| ] | ||
| @patch("aiohttp.ClientSession.get") | ||
| @patch.object(DatadisConnector, "_async_get_token", new_callable=AsyncMock, return_value=True) | ||
| def test_get_supplies(mock_token, mock_get, snapshot): | ||
| """Test a successful 'get_supplies' query (syrupy snapshot).""" | ||
| mock_response = MagicMock() | ||
| mock_response.status = 200 | ||
| mock_response.text = AsyncMock(return_value="text") | ||
| mock_response.json = AsyncMock(return_value=SUPPLIES_RESPONSE) | ||
| mock_get.return_value.__aenter__.return_value = mock_response | ||
| connector = DatadisConnector(MOCK_USERNAME, MOCK_PASSWORD) | ||
| assert connector.get_supplies() == snapshot | ||
| @patch("aiohttp.ClientSession.get") | ||
| @patch.object(DatadisConnector, "_async_get_token", new_callable=AsyncMock, return_value=True) | ||
| def test_get_contract_detail(mock_token, mock_get, snapshot): | ||
| """Test a successful 'get_contract_detail' query (syrupy snapshot).""" | ||
| mock_response = MagicMock() | ||
| mock_response.status = 200 | ||
| mock_response.text = AsyncMock(return_value="text") | ||
| mock_response.json = AsyncMock(return_value=CONTRACTS_RESPONSE) | ||
| mock_get.return_value.__aenter__.return_value = mock_response | ||
| connector = DatadisConnector(MOCK_USERNAME, MOCK_PASSWORD) | ||
| assert connector.get_contract_detail("ESXXXXXXXXXXXXXXXXTEST", "2") == snapshot | ||
| @patch("aiohttp.ClientSession.get") | ||
| @patch.object(DatadisConnector, "_async_get_token", new_callable=AsyncMock, return_value=True) | ||
| def test_get_consumption_data(mock_token, mock_get, snapshot): | ||
| """Test a successful 'get_consumption_data' query (syrupy snapshot).""" | ||
| mock_response = MagicMock() | ||
| mock_response.status = 200 | ||
| mock_response.text = AsyncMock(return_value="text") | ||
| mock_response.json = AsyncMock(return_value=CONSUMPTIONS_RESPONSE) | ||
| mock_get.return_value.__aenter__.return_value = mock_response | ||
| connector = DatadisConnector(MOCK_USERNAME, MOCK_PASSWORD) | ||
| assert connector.get_consumption_data( | ||
| "ESXXXXXXXXXXXXXXXXTEST", | ||
| "2", | ||
| datetime.datetime(2022, 10, 22, 0, 0, 0), | ||
| datetime.datetime(2022, 10, 22, 2, 0, 0), | ||
| "0", | ||
| 5, | ||
| ) == snapshot | ||
| @patch("aiohttp.ClientSession.get") | ||
| @patch.object(DatadisConnector, "_async_get_token", new_callable=AsyncMock, return_value=True) | ||
| def test_get_max_power(mock_token, mock_get, snapshot): | ||
| """Test a successful 'get_max_power' query (syrupy snapshot).""" | ||
| mock_response = MagicMock() | ||
| mock_response.status = 200 | ||
| mock_response.text = AsyncMock(return_value="text") | ||
| mock_response.json = AsyncMock(return_value=MAXIMETER_RESPONSE) | ||
| mock_get.return_value.__aenter__.return_value = mock_response | ||
| connector = DatadisConnector(MOCK_USERNAME, MOCK_PASSWORD) | ||
| assert connector.get_max_power( | ||
| "ESXXXXXXXXXXXXXXXXTEST", | ||
| "2", | ||
| datetime.datetime(2022, 3, 1, 0, 0, 0), | ||
| datetime.datetime(2022, 4, 1, 0, 0, 0), | ||
| None, | ||
| ) == snapshot | ||
| @patch("aiohttp.ClientSession.get") | ||
| @patch.object(DatadisConnector, "_async_get_token", new_callable=AsyncMock, return_value=True) | ||
| def test_get_supplies_empty_response(mock_token, mock_get, snapshot): | ||
| """Test get_supplies with empty response (syrupy snapshot).""" | ||
| mock_response = MagicMock() | ||
| mock_response.status = 200 | ||
| mock_response.text = AsyncMock(return_value="text") | ||
| mock_response.json = AsyncMock(return_value=[]) | ||
| mock_get.return_value.__aenter__.return_value = mock_response | ||
| connector = DatadisConnector(MOCK_USERNAME, MOCK_PASSWORD) | ||
| assert connector.get_supplies() == snapshot | ||
| @patch("aiohttp.ClientSession.get") | ||
| @patch.object(DatadisConnector, "_async_get_token", new_callable=AsyncMock, return_value=True) | ||
| def test_get_supplies_malformed_response(mock_token, mock_get, snapshot): | ||
| """Test get_supplies with malformed response (missing required fields, syrupy snapshot).""" | ||
| malformed = [{"validDateFrom": "2022/03/09"}] # missing 'cups', etc. | ||
| mock_response = MagicMock() | ||
| mock_response.status = 200 | ||
| mock_response.text = AsyncMock(return_value="text") | ||
| mock_response.json = AsyncMock(return_value=malformed) | ||
| mock_get.return_value.__aenter__.return_value = mock_response | ||
| connector = DatadisConnector(MOCK_USERNAME, MOCK_PASSWORD) | ||
| assert connector.get_supplies() == snapshot | ||
| @patch("aiohttp.ClientSession.get") | ||
| @patch.object(DatadisConnector, "_async_get_token", new_callable=AsyncMock, return_value=True) | ||
| def test_get_supplies_partial_response(mock_token, mock_get, snapshot): | ||
| """Test get_supplies with partial valid/invalid response (syrupy snapshot).""" | ||
| partial = [ | ||
| SUPPLIES_RESPONSE[0], | ||
| {"validDateFrom": "2022/03/09"}, # invalid | ||
| ] | ||
| mock_response = MagicMock() | ||
| mock_response.status = 200 | ||
| mock_response.text = AsyncMock(return_value="text") | ||
| mock_response.json = AsyncMock(return_value=partial) | ||
| mock_get.return_value.__aenter__.return_value = mock_response | ||
| connector = DatadisConnector(MOCK_USERNAME, MOCK_PASSWORD) | ||
| assert connector.get_supplies() == snapshot | ||
| @patch("aiohttp.ClientSession.get") | ||
| @patch.object(DatadisConnector, "_async_get_token", new_callable=AsyncMock, return_value=True) | ||
| def test_get_consumption_data_cache(mock_token, mock_get, snapshot): | ||
| """Test get_consumption_data uses cache on second call (should not call HTTP again, syrupy snapshot).""" | ||
| mock_response = MagicMock() | ||
| mock_response.status = 200 | ||
| mock_response.text = AsyncMock(return_value="text") | ||
| mock_response.json = AsyncMock(return_value=CONSUMPTIONS_RESPONSE) | ||
| mock_get.return_value.__aenter__.return_value = mock_response | ||
| connector = DatadisConnector(MOCK_USERNAME, MOCK_PASSWORD) | ||
| # First call populates cache | ||
| assert connector.get_consumption_data( | ||
| "ESXXXXXXXXXXXXXXXXTEST", | ||
| "2", | ||
| datetime.datetime(2022, 10, 22, 0, 0, 0), | ||
| datetime.datetime(2022, 10, 22, 2, 0, 0), | ||
| "0", | ||
| 5, | ||
| ) == snapshot | ||
| # Second call should use cache, not call HTTP again | ||
| mock_get.reset_mock() | ||
| assert connector.get_consumption_data( | ||
| "ESXXXXXXXXXXXXXXXXTEST", | ||
| "2", | ||
| datetime.datetime(2022, 10, 22, 0, 0, 0), | ||
| datetime.datetime(2022, 10, 22, 2, 0, 0), | ||
| "0", | ||
| 5, | ||
| ) == snapshot | ||
| mock_get.assert_not_called() | ||
| @patch("aiohttp.ClientSession.get") | ||
| @patch.object(DatadisConnector, "_async_get_token", new_callable=AsyncMock, return_value=True) | ||
| def test_get_supplies_optional_fields_none(mock_token, mock_get, snapshot): | ||
| """Test get_supplies with optional fields as None (syrupy snapshot).""" | ||
| response = [ | ||
| { | ||
| "cups": "ESXXXXXXXXXXXXXXXXTEST", | ||
| "validDateFrom": "2022/03/09", | ||
| "validDateTo": "2022/10/28", | ||
| "address": None, | ||
| "postalCode": None, | ||
| "province": None, | ||
| "municipality": None, | ||
| "distributor": None, | ||
| "pointType": 5, | ||
| "distributorCode": "2", | ||
| } | ||
| ] | ||
| mock_response = MagicMock() | ||
| mock_response.status = 200 | ||
| mock_response.text = AsyncMock(return_value="text") | ||
| mock_response.json = AsyncMock(return_value=response) | ||
| mock_get.return_value.__aenter__.return_value = mock_response | ||
| connector = DatadisConnector(MOCK_USERNAME, MOCK_PASSWORD) | ||
| assert connector.get_supplies() == snapshot |
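All of the tests above rely on the same trick to fake `aiohttp.ClientSession.get`, which is an async context manager: assigning to `mock_get.return_value.__aenter__.return_value` makes `async with get(...)` yield the prepared response. The pattern reduces to this stdlib-only sketch (`fetch` and the URL are illustrative stand-ins, not part of the library):

```python
import asyncio
from unittest.mock import AsyncMock, MagicMock

# Stand-in for an aiohttp response: sync attribute, async .json()
mock_response = MagicMock()
mock_response.status = 200
mock_response.json = AsyncMock(return_value=[{"cups": "ESXXXXXXXXXXXXXXXXTEST"}])

# Stand-in for session.get: calling it returns an async context manager
# (MagicMock supports __aenter__/__aexit__ as AsyncMocks since Python 3.8)
mock_get = MagicMock()
mock_get.return_value.__aenter__.return_value = mock_response

async def fetch(get):
    """Hypothetical consumer mirroring the connector's request shape."""
    async with get("https://example.invalid") as reply:
        if reply.status == 200:
            return await reply.json()
    return []

result = asyncio.run(fetch(mock_get))
print(result)  # [{'cups': 'ESXXXXXXXXXXXXXXXXTEST'}]
```

Patching `_async_get_token` separately (as the tests do) keeps the token handshake out of the picture, so each test exercises only the response-parsing path.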
| """A collection of tests for e-data processors""" | ||
| import datetime as dt | ||
| import json | ||
| import pathlib | ||
| import typing | ||
| from collections.abc import Iterable | ||
| import pytest | ||
| from ..definitions import PricingData, PricingRules | ||
| from ..processors import utils | ||
| from ..processors.base import Processor | ||
| from ..processors.billing import BillingProcessor | ||
| from ..processors.consumption import ConsumptionProcessor | ||
| from ..processors.maximeter import MaximeterProcessor | ||
| TESTS_DIR = str(pathlib.Path(__file__).parent.resolve()) | ||
| TEST_GOOD_INPUT = TESTS_DIR + "/assets/processors/edata.storage_TEST" | ||
| def _compare_processor_output( | ||
| source_filepath: str, | ||
| processor_class: Processor, | ||
| key: str, | ||
| snapshot, | ||
| ): | ||
| with open(source_filepath, encoding="utf-8") as original_file: | ||
| data = utils.deserialize_dict(json.load(original_file)) | ||
| if key == "consumptions": | ||
| processor = processor_class({"consumptions": data[key]}) | ||
| else: | ||
| processor = processor_class(data[key]) | ||
| assert utils.serialize_dict(processor.output) == snapshot | ||
| @pytest.mark.parametrize( | ||
| "processor, key", | ||
| [(ConsumptionProcessor, "consumptions"), (MaximeterProcessor, "maximeter")], | ||
| ) | ||
| def test_processor(processor: Processor, key: str, snapshot) -> None: | ||
| """Tests all processors except billing (syrupy snapshot)""" | ||
| _compare_processor_output( | ||
| TEST_GOOD_INPUT, | ||
| processor, | ||
| key, | ||
| snapshot, | ||
| ) | ||
| @pytest.mark.parametrize( | ||
| "_id, rules, prices", | ||
| [ | ||
| ( | ||
| "custom_prices", | ||
| PricingRules( | ||
| p1_kw_year_eur=30.67266, | ||
| p2_kw_year_eur=1.4243591, | ||
| meter_month_eur=0.81, | ||
| market_kw_year_eur=3.113, | ||
| electricity_tax=1.0511300560, | ||
| iva_tax=1.1, | ||
| p1_kwh_eur=None, | ||
| p2_kwh_eur=None, | ||
| p3_kwh_eur=None, | ||
| ), | ||
| [ | ||
| PricingData( | ||
| datetime=dt.datetime(2022, 10, 22, x, 0, 0), | ||
| value_eur_kWh=1, | ||
| delta_h=1, | ||
| ) | ||
| for x in range(0, 24) | ||
| ], | ||
| ), | ||
| ( | ||
| "constant_prices", | ||
| PricingRules( | ||
| p1_kw_year_eur=30.67266, | ||
| p2_kw_year_eur=1.4243591, | ||
| meter_month_eur=0.81, | ||
| market_kw_year_eur=3.113, | ||
| electricity_tax=1.0511300560, | ||
| iva_tax=1.1, | ||
| p1_kwh_eur=1, | ||
| p2_kwh_eur=1, | ||
| p3_kwh_eur=1, | ||
| ), | ||
| None, | ||
| ), | ||
| ], | ||
| ) | ||
| def test_processor_billing( | ||
| _id: str, rules: PricingRules, prices: typing.Optional[Iterable[PricingData]], snapshot | ||
| ): | ||
| """Tests billing processor (syrupy snapshot)""" | ||
| with open(TEST_GOOD_INPUT, "r", encoding="utf-8") as original_file: | ||
| data = utils.deserialize_dict(json.load(original_file)) | ||
| processor = BillingProcessor( | ||
| { | ||
| "consumptions": data["consumptions"], | ||
| "contracts": data["contracts"], | ||
| "prices": prices, | ||
| "rules": rules, | ||
| } | ||
| ) | ||
| assert utils.serialize_dict(processor.output) == snapshot |
| """Tests for REData (online)""" | ||
| from datetime import datetime, timedelta | ||
| from ..connectors.redata import REDataConnector | ||
| def test_get_realtime_prices(): | ||
| """Test a successful 'get_realtime_prices' query""" | ||
| connector = REDataConnector() | ||
| yesterday = datetime.now().replace(hour=0, minute=0, second=0, microsecond=0) - timedelta(days=1) | ||
| response = connector.get_realtime_prices( | ||
| yesterday, yesterday + timedelta(days=1) - timedelta(minutes=1), False | ||
| ) | ||
| assert len(response) == 24 |
@@ -9,3 +9,5 @@ """Datadis API connector. | ||
| import asyncio | ||
| import contextlib | ||
| from datetime import datetime, timedelta | ||
| import hashlib | ||
@@ -15,16 +17,14 @@ import logging | ||
| import tempfile | ||
| from datetime import datetime, timedelta | ||
| import diskcache | ||
| import aiohttp | ||
| import diskcache | ||
| from dateutil.relativedelta import relativedelta | ||
| from edata import utils | ||
| from edata.models import Consumption, Contract, MaxPower, Supply | ||
| import aiohttp | ||
| import asyncio | ||
| from ..definitions import ConsumptionData, ContractData, MaxPowerData, SupplyData | ||
| from ..processors import utils | ||
| _LOGGER = logging.getLogger(__name__) | ||
| # Request timeout constant | ||
| REQUESTS_TIMEOUT = 30 | ||
| # Token-related constants | ||
@@ -74,2 +74,3 @@ URL_TOKEN = "https://datadis.es/nikola-auth/tokens/login" | ||
| # Cache-related constants | ||
@@ -79,2 +80,11 @@ RECENT_CACHE_SUBDIR = "cache" | ||
| def migrate_storage(storage_dir): | ||
| """Migrate storage from older versions.""" | ||
| with contextlib.suppress(FileNotFoundError): | ||
| os.remove(os.path.join(storage_dir, "edata_recent_queries.json")) | ||
| os.remove(os.path.join(storage_dir, "edata_recent_queries_cache.json")) | ||
| class DatadisConnector: | ||
@@ -90,5 +100,2 @@ """A Datadis private API connector.""" | ||
| ) -> None: | ||
| """DatadisConnector constructor.""" | ||
| # initialize some things | ||
| self._usr = username | ||
@@ -98,5 +105,8 @@ self._pwd = password | ||
| self._smart_fetch = enable_smart_fetch | ||
| self._recent_queries = {} | ||
| self._recent_cache = {} | ||
| self._warned_queries = [] | ||
| if storage_path is not None: | ||
| self._recent_cache_dir = os.path.join(storage_path, RECENT_CACHE_SUBDIR) | ||
| migrate_storage(storage_path) | ||
| else: | ||
@@ -106,77 +116,60 @@ self._recent_cache_dir = os.path.join( | ||
| ) | ||
| os.makedirs(self._recent_cache_dir, exist_ok=True) | ||
| self._cache = diskcache.Cache(self._recent_cache_dir) | ||
| # Initialize diskcache for persistent caching | ||
| self._cache = diskcache.Cache( | ||
| self._recent_cache_dir, | ||
| size_limit=100 * 1024 * 1024, # 100MB limit | ||
| eviction_policy="least-recently-used", | ||
| ) | ||
| def _update_recent_queries(self, query: str, data: dict | None = None) -> None: | ||
| """Cache a successful query to avoid exceeding query limits (diskcache).""" | ||
| hash_query = hashlib.md5(query.encode()).hexdigest() | ||
| try: | ||
| self._cache.set(hash_query, data, expire=QUERY_LIMIT.total_seconds()) | ||
| _LOGGER.info("Updating cache item '%s'", hash_query) | ||
| except Exception as e: | ||
| _LOGGER.warning("Unknown error while updating cache: %s", e) | ||
| async def login(self): | ||
| """Attempt to log in with the provided credentials.""" | ||
| return await self._get_token() | ||
| def _is_recent_query(self, query: str) -> bool: | ||
| """Check if a query has been done recently to avoid exceeding query limits (diskcache).""" | ||
| hash_query = hashlib.md5(query.encode()).hexdigest() | ||
| return hash_query in self._cache | ||
| async def get_supplies(self, authorized_nif: str | None = None) -> list[Supply]: | ||
| """Datadis 'get_supplies' query (async version).""" | ||
| def _get_cache_for_query(self, query: str) -> dict | None: | ||
| """Return cached response for a query (diskcache).""" | ||
| hash_query = hashlib.md5(query.encode()).hexdigest() | ||
| try: | ||
| return self._cache.get(hash_query, default=None) | ||
| except Exception: | ||
| return None | ||
| data = {} | ||
| # If authorized_nif is provided, we have to include it as parameter | ||
| if authorized_nif is not None: | ||
| data["authorizedNif"] = authorized_nif | ||
| async def _async_get_token(self): | ||
| """Private async method that fetches a new token if needed.""" | ||
| _LOGGER.info("No token found, fetching a new one") | ||
| is_valid_token = False | ||
| timeout = aiohttp.ClientTimeout(total=TIMEOUT) | ||
| async with aiohttp.ClientSession(timeout=timeout) as session: | ||
| try: | ||
| async with session.post( | ||
| URL_TOKEN, | ||
| data={ | ||
| TOKEN_USERNAME: self._usr.encode("utf-8"), | ||
| TOKEN_PASSWD: self._pwd.encode("utf-8"), | ||
| }, | ||
| ) as response: | ||
| text = await response.text() | ||
| if response.status == 200: | ||
| self._token["encoded"] = text | ||
| self._token["headers"] = {"Authorization": "Bearer " + self._token["encoded"]} | ||
| is_valid_token = True | ||
| else: | ||
| _LOGGER.error("Unknown error while retrieving token, got %s", text) | ||
| except Exception as e: | ||
| _LOGGER.error("Exception while retrieving token: %s", e) | ||
| return is_valid_token | ||
| # Request the resource using get method | ||
| response = await self._get( | ||
| URL_GET_SUPPLIES, request_data=data, ignore_recent_queries=True | ||
| ) | ||
| # Response is a list of serialized supplies. | ||
| # We will iter through them to transform them into Supply objects | ||
| supplies = [] | ||
| # Build tomorrow Y/m/d string since we will use it as the 'date_end' of | ||
| # active supplies | ||
| tomorrow_str = (datetime.today() + timedelta(days=1)).strftime("%Y/%m/%d") | ||
| for i in response: | ||
| # check data integrity (maybe this can be suppressed if datadis proves to be reliable) | ||
| if all(k in i for k in GET_SUPPLIES_MANDATORY_FIELDS): | ||
| supplies.append( | ||
| Supply( | ||
| cups=i["cups"], # the supply identifier | ||
| date_start=datetime.strptime( | ||
| ( | ||
| i["validDateFrom"] | ||
| if i["validDateFrom"] != "" | ||
| else "1970/01/01" | ||
| ), | ||
| "%Y/%m/%d", | ||
| ), # start date of the supply. 1970/01/01 if unset. | ||
| date_end=datetime.strptime( | ||
| ( | ||
| i["validDateTo"] | ||
| if i["validDateTo"] != "" | ||
| else tomorrow_str | ||
| ), | ||
| "%Y/%m/%d", | ||
| ), # end date of the supply, tomorrow if unset | ||
| # the following parameters are not crucial, so they can be none | ||
| address=i.get("address", None), | ||
| postal_code=i.get("postalCode", None), | ||
| province=i.get("province", None), | ||
| municipality=i.get("municipality", None), | ||
| distributor=i.get("distributor", None), | ||
| # these two are mandatory, we will use them to fetch contracts data | ||
| point_type=i["pointType"], | ||
| distributor_code=i["distributorCode"], | ||
| ) | ||
| ) | ||
| else: | ||
| _LOGGER.warning( | ||
| "Weird data structure while fetching supplies data, got %s", | ||
| response, | ||
| ) | ||
| return supplies | ||
| def login(self): | ||
| """Attempt to log in with the provided credentials (sync wrapper).""" | ||
| return asyncio.run(self._async_get_token()) | ||
| async def _get( | ||
| async def _async_get( | ||
| self, | ||
@@ -189,4 +182,3 @@ url: str, | ||
| ): | ||
| """Get request for Datadis API (async version).""" | ||
| """Async get request for Datadis API.""" | ||
| if request_data is None: | ||
@@ -197,117 +189,60 @@ data = {} | ||
| # build get parameters | ||
| params = "?" if len(data) > 0 else "" | ||
| for param in data: | ||
| key = param | ||
| value = data[param] | ||
| params = params + f"{key}={value}&" | ||
| anonym_params = "?" if len(data) > 0 else "" | ||
| # build anonymized params for logging | ||
| for anonym_param in data: | ||
| key = anonym_param | ||
| if key == "cups": | ||
| value = "xxxx" + data[anonym_param][-5:] | ||
| elif key == "authorizedNif": | ||
| value = "xxxx" | ||
| else: | ||
| value = data[anonym_param] | ||
| anonym_params = anonym_params + f"{key}={value}&" | ||
| # Check diskcache first (unless ignoring cache) | ||
| if not ignore_recent_queries: | ||
| cache_data = { | ||
| "url": url, | ||
| "request_data": request_data, | ||
| "refresh_token": refresh_token, | ||
| "is_retry": is_retry, | ||
| } | ||
| cache_key = hashlib.sha256(str(cache_data).encode()).hexdigest() | ||
| try: | ||
| # Run cache get operation in thread to avoid blocking | ||
| cached_result = await asyncio.to_thread(self._cache.get, cache_key) | ||
| if cached_result is not None and isinstance( | ||
| cached_result, (list, dict) | ||
| ): | ||
| _LOGGER.info( | ||
| "Returning cached response for '%s'", url + anonym_params | ||
| ) | ||
| return cached_result | ||
| except Exception as e: | ||
| _LOGGER.warning("Error reading cache: %s", e) | ||
| # refresh token if needed (recursive approach) | ||
| is_valid_token = False | ||
| response = [] | ||
| if refresh_token: | ||
| is_valid_token = await self._get_token() | ||
| is_valid_token = await self._async_get_token() | ||
| if is_valid_token or not refresh_token: | ||
| params = "?" if len(data) > 0 else "" | ||
| for param in data: | ||
| key = param | ||
| value = data[param] | ||
| params = params + f"{key}={value}&" | ||
| anonym_params = "?" if len(data) > 0 else "" | ||
| for anonym_param in data: | ||
| key = anonym_param | ||
| if key == "cups": | ||
| value = "xxxx" + str(data[anonym_param])[-5:] | ||
| elif key == "authorizedNif": | ||
| value = "xxxx" | ||
| else: | ||
| value = data[anonym_param] | ||
| anonym_params = anonym_params + f"{key}={value}&" | ||
| # run the query | ||
| timeout = aiohttp.ClientTimeout(total=REQUESTS_TIMEOUT) | ||
| async with aiohttp.ClientSession(timeout=timeout) as session: | ||
| try: | ||
| _LOGGER.info("GET %s", url + anonym_params) | ||
| headers = {"Accept-Encoding": "identity"} | ||
| if not ignore_recent_queries and self._is_recent_query(url + params): | ||
| _cache = self._get_cache_for_query(url + params) | ||
| if _cache is not None: | ||
| _LOGGER.info( | ||
| "Returning cached response for '%s'", url + anonym_params | ||
| ) | ||
| return _cache | ||
| return [] | ||
| # Ensure we have a token | ||
| if not self._token.get("encoded"): | ||
| await self._get_token() | ||
| headers["Authorization"] = f"Bearer {self._token['encoded']}" | ||
| async with session.get(url + params, headers=headers) as reply: | ||
| # eval response | ||
| try: | ||
| _LOGGER.info("GET %s", url + anonym_params) | ||
| headers = {"Accept-Encoding": "identity"} | ||
| if self._token.get("headers"): | ||
| headers.update(self._token["headers"]) | ||
| timeout = aiohttp.ClientTimeout(total=TIMEOUT) | ||
| async with aiohttp.ClientSession(timeout=timeout) as session: | ||
| async with session.get( | ||
| url + params, | ||
| headers=headers, | ||
| ) as reply: | ||
| text = await reply.text() | ||
| if reply.status == 200: | ||
| # we're here if reply seems valid | ||
| _LOGGER.info("Got 200 OK") | ||
| try: | ||
| response_json = await reply.json(content_type=None) | ||
| if response_json: | ||
| response = response_json | ||
| # Store in diskcache with 24h TTL | ||
| if not ignore_recent_queries and isinstance( | ||
| response, (list, dict) | ||
| ): | ||
| try: | ||
| cache_data = { | ||
| "url": url, | ||
| "request_data": request_data, | ||
| "refresh_token": refresh_token, | ||
| "is_retry": is_retry, | ||
| } | ||
| cache_key = hashlib.sha256( | ||
| str(cache_data).encode() | ||
| ).hexdigest() | ||
| ttl_seconds = int( | ||
| QUERY_LIMIT.total_seconds() | ||
| ) | ||
| # Run cache set operation in thread to avoid blocking | ||
| await asyncio.to_thread( | ||
| self._cache.set, | ||
| cache_key, | ||
| response, | ||
| expire=ttl_seconds, | ||
| ) | ||
| _LOGGER.info( | ||
| "Cached response for %s with TTL %d seconds", | ||
| url, | ||
| ttl_seconds, | ||
| ) | ||
| except Exception as e: | ||
| _LOGGER.warning( | ||
| "Error storing in cache: %s", e | ||
| ) | ||
| json_data = await reply.json() | ||
| if json_data: | ||
| response = json_data | ||
| if not ignore_recent_queries: | ||
| self._update_recent_queries(url + params, response) | ||
| else: | ||
| # this mostly happens when datadis provides an empty response | ||
| _LOGGER.info("Got an empty response") | ||
| except Exception as e: | ||
| # Handle non-JSON responses | ||
| _LOGGER.info("Got an empty or non-JSON response") | ||
| _LOGGER.exception(e) | ||
| if not ignore_recent_queries: | ||
| self._update_recent_queries(url + params) | ||
| except Exception: | ||
| _LOGGER.warning("Failed to parse JSON response") | ||
| elif reply.status == 401 and not refresh_token: | ||
| # we're here if we were unauthorized so we will refresh the token | ||
| response = await self._get( | ||
| response = await self._async_get( | ||
| url, | ||
@@ -319,10 +254,10 @@ request_data=data, | ||
| elif reply.status == 429: | ||
| # we're here if we exceeded datadis API rates (24h) | ||
| _LOGGER.warning( | ||
| "Got status code '%s' with message '%s'", | ||
| reply.status, | ||
| await reply.text(), | ||
| text, | ||
| ) | ||
| if not ignore_recent_queries: | ||
| self._update_recent_queries(url + params) | ||
| elif is_retry: | ||
| # otherwise, if this was a retried request... warn the user | ||
| if (url + params) not in self._warned_queries: | ||
@@ -332,10 +267,11 @@ _LOGGER.warning( | ||
| reply.status, | ||
| await reply.text(), | ||
| text, | ||
| "Query temporarily disabled", | ||
| "Future 500 code errors for this query will be silenced until restart", | ||
| ) | ||
| if not ignore_recent_queries: | ||
| self._update_recent_queries(url + params) | ||
| self._warned_queries.append(url + params) | ||
| else: | ||
| # finally, retry since an unexpected error took place (mostly 500 errors - server fault) | ||
| response = await self._get( | ||
| response = await self._async_get( | ||
| url, | ||
@@ -346,21 +282,68 @@ request_data, | ||
| ) | ||
| except asyncio.TimeoutError: | ||
| _LOGGER.warning("Timeout at %s", url + anonym_params) | ||
| return [] | ||
| except Exception as e: | ||
| _LOGGER.error( | ||
| "Error during async request to %s: %s", url + anonym_params, e | ||
| except asyncio.TimeoutError: | ||
| _LOGGER.warning("Timeout at %s", url + anonym_params) | ||
| return [] | ||
| except Exception as e: | ||
| _LOGGER.warning("Exception at %s: %s", url + anonym_params, e) | ||
| return [] | ||
| return response | ||
| async def async_get_supplies(self, authorized_nif: str | None = None): | ||
| data = {} | ||
| if authorized_nif is not None: | ||
| data["authorizedNif"] = authorized_nif | ||
| response = await self._async_get( | ||
| URL_GET_SUPPLIES, request_data=data, ignore_recent_queries=True | ||
| ) | ||
| supplies = [] | ||
| tomorrow_str = (datetime.today() + timedelta(days=1)).strftime("%Y/%m/%d") | ||
| for i in response: | ||
| if all(k in i for k in GET_SUPPLIES_MANDATORY_FIELDS): | ||
| supplies.append( | ||
| SupplyData( | ||
| cups=i["cups"], | ||
| date_start=datetime.strptime( | ||
| ( | ||
| i["validDateFrom"] | ||
| if i["validDateFrom"] != "" | ||
| else "1970/01/01" | ||
| ), | ||
| "%Y/%m/%d", | ||
| ), | ||
| date_end=datetime.strptime( | ||
| ( | ||
| i["validDateTo"] | ||
| if i["validDateTo"] != "" | ||
| else tomorrow_str | ||
| ), | ||
| "%Y/%m/%d", | ||
| ), | ||
| address=i.get("address", None), | ||
| postal_code=i.get("postalCode", None), | ||
| province=i.get("province", None), | ||
| municipality=i.get("municipality", None), | ||
| distributor=i.get("distributor", None), | ||
| pointType=i["pointType"], | ||
| distributorCode=i["distributorCode"], | ||
| ) | ||
| return [] | ||
| ) | ||
| else: | ||
| _LOGGER.warning( | ||
| "Weird data structure while fetching supplies data, got %s", | ||
| response, | ||
| ) | ||
| return supplies | ||
| return response | ||
| def get_supplies(self, authorized_nif: str | None = None): | ||
| """Datadis 'get_supplies' query (sync wrapper).""" | ||
| return asyncio.run(self.async_get_supplies(authorized_nif=authorized_nif)) | ||
| async def get_contract_detail( | ||
| async def async_get_contract_detail( | ||
| self, cups: str, distributor_code: str, authorized_nif: str | None = None | ||
| ) -> list[Contract]: | ||
| """Datadis get_contract_detail query (async version).""" | ||
| ): | ||
| data = {"cups": cups, "distributorCode": distributor_code} | ||
| if authorized_nif is not None: | ||
| data["authorizedNif"] = authorized_nif | ||
| response = await self._get( | ||
| response = await self._async_get( | ||
| URL_GET_CONTRACT_DETAIL, request_data=data, ignore_recent_queries=True | ||
@@ -373,3 +356,3 @@ ) | ||
| contracts.append( | ||
| Contract( | ||
| ContractData( | ||
| date_start=datetime.strptime( | ||
@@ -384,3 +367,3 @@ i["startDate"] if i["startDate"] != "" else "1970/01/01", | ||
| marketer=i["marketer"], | ||
| distributor_code=distributor_code, | ||
| distributorCode=distributor_code, | ||
| power_p1=( | ||
@@ -405,3 +388,10 @@ i["contractedPowerkW"][0] | ||
| async def get_consumption_data( | ||
| def get_contract_detail( | ||
| self, cups: str, distributor_code: str, authorized_nif: str | None = None | ||
| ): | ||
| """Datadis get_contract_detail query (sync wrapper).""" | ||
| return asyncio.run(self.async_get_contract_detail(cups, distributor_code, authorized_nif)) | ||
| async def async_get_consumption_data( | ||
| self, | ||
@@ -416,8 +406,6 @@ cups: str, | ||
| is_smart_fetch: bool = False, | ||
| ) -> list[Consumption]: | ||
| """Datadis get_consumption_data query (async version).""" | ||
| ): | ||
| if self._smart_fetch and not is_smart_fetch: | ||
| _start = start_date | ||
| consumptions_dicts = [] | ||
| consumptions = [] | ||
| while _start < end_date: | ||
@@ -427,3 +415,3 @@ _end = min( | ||
| ) | ||
| batch_consumptions = await self.get_consumption_data( | ||
| sub_consumptions = await self.async_get_consumption_data( | ||
| cups, | ||
@@ -438,12 +426,9 @@ distributor_code, | ||
| ) | ||
| # Convert to dicts for extend_by_key function | ||
| batch_dicts = [c.model_dump() for c in batch_consumptions] | ||
| consumptions_dicts = utils.extend_by_key( | ||
| consumptions_dicts, | ||
| batch_dicts, | ||
| consumptions = utils.extend_by_key( | ||
| consumptions, | ||
| sub_consumptions, | ||
| "datetime", | ||
| ) | ||
| _start = _end | ||
| # Convert back to Pydantic models | ||
| return [Consumption(**c) for c in consumptions_dicts] | ||
| return consumptions | ||
@@ -461,3 +446,3 @@ data = { | ||
| response = await self._get(URL_GET_CONSUMPTION_DATA, request_data=data) | ||
| response = await self._async_get(URL_GET_CONSUMPTION_DATA, request_data=data) | ||
@@ -478,7 +463,7 @@ consumptions = [] | ||
| consumptions.append( | ||
| Consumption( | ||
| ConsumptionData( | ||
| datetime=date_as_dt, | ||
| delta_h=1, | ||
| value_kwh=i["consumptionKWh"], | ||
| surplus_kwh=_surplus, | ||
| value_kWh=i["consumptionKWh"], | ||
| surplus_kWh=_surplus, | ||
| real=i["obtainMethod"] == "Real", | ||
@@ -494,3 +479,3 @@ ) | ||
| async def get_max_power( | ||
| def get_consumption_data( | ||
| self, | ||
@@ -501,6 +486,28 @@ cups: str, | ||
| end_date: datetime, | ||
| measurement_type: str, | ||
| point_type: int, | ||
| authorized_nif: str | None = None, | ||
| ) -> list[MaxPower]: | ||
| """Datadis get_max_power query (async version).""" | ||
| is_smart_fetch: bool = False, | ||
| ): | ||
| """Datadis get_consumption_data query (sync wrapper).""" | ||
| return asyncio.run(self.async_get_consumption_data( | ||
| cups, | ||
| distributor_code, | ||
| start_date, | ||
| end_date, | ||
| measurement_type, | ||
| point_type, | ||
| authorized_nif, | ||
| is_smart_fetch, | ||
| )) | ||
| async def async_get_max_power( | ||
| self, | ||
| cups: str, | ||
| distributor_code: str, | ||
| start_date: datetime, | ||
| end_date: datetime, | ||
| authorized_nif: str | None = None, | ||
| ): | ||
| data = { | ||
@@ -514,3 +521,3 @@ "cups": cups, | ||
| data["authorizedNif"] = authorized_nif | ||
| response = await self._get(URL_GET_MAX_POWER, request_data=data) | ||
| response = await self._async_get(URL_GET_MAX_POWER, request_data=data) | ||
| maxpower_values = [] | ||
@@ -520,7 +527,7 @@ for i in response: | ||
| maxpower_values.append( | ||
| MaxPower( | ||
| MaxPowerData( | ||
| datetime=datetime.strptime( | ||
| f"{i['date']} {i['time']}", "%Y/%m/%d %H:%M" | ||
| ), | ||
| value_kw=i["maxPower"], | ||
| value_kW=i["maxPower"], | ||
| ) | ||
@@ -535,35 +542,17 @@ ) | ||
| async def _get_token(self): | ||
| """Private method that fetches a new token if needed (async version).""" | ||
| _LOGGER.info("Fetching token for async requests") | ||
| is_valid_token = False | ||
| timeout = aiohttp.ClientTimeout(total=REQUESTS_TIMEOUT) | ||
| # Prepare data as URL-encoded string, same as sync version | ||
| form_data = { | ||
| TOKEN_USERNAME: self._usr, | ||
| TOKEN_PASSWD: self._pwd, | ||
| } | ||
| async with aiohttp.ClientSession(timeout=timeout) as session: | ||
| try: | ||
| async with session.post( | ||
| URL_TOKEN, | ||
| data=form_data, | ||
| headers={"Content-Type": "application/x-www-form-urlencoded"}, | ||
| ) as response: | ||
| if response.status == 200: | ||
| # store token encoded | ||
| self._token["encoded"] = await response.text() | ||
| is_valid_token = True | ||
| else: | ||
| _LOGGER.error( | ||
| "Unknown error while retrieving async token, got %s", | ||
| await response.text(), | ||
| ) | ||
| except Exception as e: | ||
| _LOGGER.error("Error during async token fetch: %s", e) | ||
| return is_valid_token | ||
| def get_max_power( | ||
| self, | ||
| cups: str, | ||
| distributor_code: str, | ||
| start_date: datetime, | ||
| end_date: datetime, | ||
| authorized_nif: str | None = None, | ||
| ): | ||
| """Datadis get_max_power query (sync wrapper).""" | ||
| return asyncio.run(self.async_get_max_power( | ||
| cups, | ||
| distributor_code, | ||
| start_date, | ||
| end_date, | ||
| authorized_nif, | ||
| )) |
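The Datadis diff above converts every public query to an async implementation and adds thin sync wrappers that delegate via `asyncio.run`. A minimal sketch of that wrapper pattern, with an illustrative class and stub body standing in for the real aiohttp-backed connector:

```python
import asyncio


class DatadisLikeConnector:
    """Minimal sketch of the sync-wrapper pattern introduced in this diff.

    The class name and method body are illustrative stand-ins; the real
    connector issues aiohttp requests and takes more parameters
    (distributor_code, start_date, end_date, authorized_nif, ...).
    """

    async def async_get_max_power(self, cups: str) -> list[dict]:
        # Stand-in for the real request to URL_GET_MAX_POWER.
        await asyncio.sleep(0)
        return [{"cups": cups, "value_kW": 4.6}]

    def get_max_power(self, cups: str) -> list[dict]:
        """Sync wrapper: drives the coroutine to completion.

        Note that asyncio.run() raises RuntimeError when called from
        inside an already-running event loop, so wrappers like these
        suit synchronous callers only.
        """
        return asyncio.run(self.async_get_max_power(cups))


result = DatadisLikeConnector().get_max_power("ES1234000000000001JN0F")
print(result)
```

Async-first callers (e.g. Home Assistant integrations) can await `async_get_max_power` directly and skip the wrapper entirely.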
| """A REData API connector""" | ||
| import asyncio | ||
| import datetime as dt | ||
@@ -8,5 +7,6 @@ import logging | ||
| import aiohttp | ||
| import asyncio | ||
| from dateutil import parser | ||
| from edata.models.pricing import PricingData | ||
| from ..definitions import PricingData | ||
@@ -33,6 +33,6 @@ _LOGGER = logging.getLogger(__name__) | ||
| async def get_realtime_prices( | ||
| async def async_get_realtime_prices( | ||
| self, dt_from: dt.datetime, dt_to: dt.datetime, is_ceuta_melilla: bool = False | ||
| ) -> list: | ||
| """GET query to fetch realtime pvpc prices, historical data is limited to current month (async version)""" | ||
| """GET query to fetch realtime pvpc prices, historical data is limited to current month (async)""" | ||
| url = URL_REALTIME_PRICES.format( | ||
@@ -44,32 +44,26 @@ geo_id=8744 if is_ceuta_melilla else 8741, | ||
| data = [] | ||
| timeout = aiohttp.ClientTimeout(total=REQUESTS_TIMEOUT) | ||
| async with aiohttp.ClientSession(timeout=timeout) as session: | ||
| try: | ||
| async with session.get(url) as response: | ||
| if response.status == 200: | ||
| res_json = await response.json() | ||
| if res_json: | ||
| try: | ||
| res_list = res_json["included"][0]["attributes"][ | ||
| "values" | ||
| ] | ||
| except (IndexError, KeyError): | ||
| _LOGGER.error( | ||
| "%s returned a malformed response: %s ", | ||
| url, | ||
| await response.text(), | ||
| async with session.get(url) as res: | ||
| text = await res.text() | ||
| if res.status == 200: | ||
| try: | ||
| res_json = await res.json() | ||
| res_list = res_json["included"][0]["attributes"]["values"] | ||
| except (IndexError, KeyError): | ||
| _LOGGER.error( | ||
| "%s returned a malformed response: %s ", | ||
| url, | ||
| text, | ||
| ) | ||
| return data | ||
| for element in res_list: | ||
| data.append( | ||
| PricingData( | ||
| datetime=parser.parse(element["datetime"]).replace(tzinfo=None), | ||
| value_eur_kWh=element["value"] / 1000, | ||
| delta_h=1, | ||
| ) | ||
| return data | ||
| for element in res_list: | ||
| data.append( | ||
| PricingData( | ||
| datetime=parser.parse( | ||
| element["datetime"] | ||
| ).replace(tzinfo=None), | ||
| value_eur_kwh=element["value"] / 1000, | ||
| delta_h=1, | ||
| ) | ||
| ) | ||
| ) | ||
| else: | ||
@@ -79,14 +73,13 @@ _LOGGER.error( | ||
| url, | ||
| await response.text(), | ||
| response.status, | ||
| text, | ||
| res.status, | ||
| ) | ||
| except asyncio.TimeoutError: | ||
| _LOGGER.error("Timeout error when fetching data from %s", url) | ||
| except aiohttp.ClientError as e: | ||
| _LOGGER.error( | ||
| "HTTP client error when fetching data from %s: %s", url, e | ||
| ) | ||
| except Exception as e: | ||
| _LOGGER.error("Unexpected error when fetching data from %s: %s", url, e) | ||
| _LOGGER.error("Exception fetching realtime prices: %s", e) | ||
| return data | ||
| return data | ||
| def get_realtime_prices( | ||
| self, dt_from: dt.datetime, dt_to: dt.datetime, is_ceuta_melilla: bool = False | ||
| ) -> list: | ||
| """GET query to fetch realtime pvpc prices, historical data is limited to current month (sync wrapper)""" | ||
| return asyncio.run(self.async_get_realtime_prices(dt_from, dt_to, is_ceuta_melilla)) |
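The REData connector's error handling orders its `except` clauses from most specific (timeout) to a final catch-all, and always returns the accumulated (possibly empty) list rather than raising. A dependency-free sketch of that layering, where `ValueError` stands in for `aiohttp.ClientError` and the fetch callable stands in for the real HTTP request:

```python
import asyncio
import logging

_LOGGER = logging.getLogger(__name__)


async def fetch_with_fallback(fetch) -> list:
    """Sketch of the layered error handling used by
    async_get_realtime_prices: most specific exception first, then a
    catch-all, always returning the (possibly empty) data list.
    ValueError stands in for aiohttp.ClientError to keep this sketch
    free of third-party imports."""
    data: list = []
    try:
        data = await fetch()
    except asyncio.TimeoutError:
        _LOGGER.error("Timeout error when fetching data")
    except ValueError as exc:
        _LOGGER.error("Client error when fetching data: %s", exc)
    except Exception as exc:  # last resort, mirrors the diff's catch-all
        _LOGGER.error("Unexpected error when fetching data: %s", exc)
    return data


async def ok() -> list:
    return [{"value_eur_kWh": 0.123, "delta_h": 1}]


async def times_out() -> list:
    raise asyncio.TimeoutError


good = asyncio.run(fetch_with_fallback(ok))
bad = asyncio.run(fetch_with_fallback(times_out))
print(good, bad)
```

Returning an empty list on failure lets callers treat "no data" and "fetch failed" uniformly, with the distinction preserved in the logs.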
+30 -590
@@ -1,25 +0,20 @@ | ||
| """Integration tests for EdataHelper with service-based architecture.""" | ||
| """A collection of tests for e-data processors""" | ||
| from datetime import datetime | ||
| from unittest.mock import AsyncMock, Mock, patch | ||
| import json | ||
| import pathlib | ||
| import pytest | ||
| from freezegun import freeze_time | ||
| from edata.const import ATTRIBUTES | ||
| from edata.helpers import EdataHelper | ||
| from edata.models.consumption import Consumption | ||
| from edata.models.contract import Contract | ||
| from edata.models.maximeter import MaxPower | ||
| from edata.models.pricing import PricingRules | ||
| from edata.models.supply import Supply | ||
| from ..definitions import PricingRules | ||
| from ..helpers import EdataHelper | ||
| from ..processors import utils | ||
| # Test data constants | ||
| TEST_CUPS = "ES1234000000000001JN0F" | ||
| TEST_USERNAME = "testuser" | ||
| TEST_PASSWORD = "testpass" | ||
| TEST_NIF = "12345678Z" | ||
| AT_TIME = "2023-10-15" | ||
| AT_TIME = "2022-10-22" | ||
| TESTS_DIR = str(pathlib.Path(__file__).parent.resolve()) | ||
| TEST_GOOD_INPUT = TESTS_DIR + "/assets/helpers/edata.storage_TEST" | ||
| TEST_EXPECTATIONS_DATA = TESTS_DIR + "/assets/helpers/data.out" | ||
| TEST_EXPECTATIONS_ATTRIBUTES = ( | ||
| TESTS_DIR + f"/assets/helpers/attributes_at_{AT_TIME}.out" | ||
| ) | ||
| # Sample pricing rules for testing | ||
| PRICING_RULES_PVPC = PricingRules( | ||
@@ -32,582 +27,27 @@ p1_kw_year_eur=30.67266, | ||
| iva_tax=1.05, | ||
| p1_kwh_eur=None, # PVPC mode | ||
| p1_kwh_eur=None, | ||
| p2_kwh_eur=None, | ||
| p3_kwh_eur=None, | ||
| surplus_p1_kwh_eur=None, | ||
| surplus_p2_kwh_eur=None, | ||
| surplus_p3_kwh_eur=None, | ||
| energy_formula="electricity_tax * iva_tax * kwh_eur * kwh", | ||
| power_formula="electricity_tax * iva_tax * (p1_kw * (p1_kw_year_eur + market_kw_year_eur) + p2_kw * p2_kw_year_eur) / 365 / 24", | ||
| others_formula="iva_tax * meter_month_eur / 30 / 24", | ||
| surplus_formula="electricity_tax * iva_tax * surplus_kwh * surplus_kwh_eur", | ||
| main_formula="energy_term + power_term + others_term", | ||
| ) | ||
| PRICING_RULES_FIXED = PricingRules( | ||
| p1_kw_year_eur=30.67266, | ||
| p2_kw_year_eur=1.4243591, | ||
| meter_month_eur=0.81, | ||
| market_kw_year_eur=3.113, | ||
| electricity_tax=1.0511300560, | ||
| iva_tax=1.05, | ||
| p1_kwh_eur=0.12, # Fixed prices | ||
| p2_kwh_eur=0.10, | ||
| p3_kwh_eur=0.08, | ||
| surplus_p1_kwh_eur=0.05, | ||
| surplus_p2_kwh_eur=0.04, | ||
| surplus_p3_kwh_eur=0.03, | ||
| energy_formula="electricity_tax * iva_tax * kwh_eur * kwh", | ||
| power_formula="electricity_tax * iva_tax * (p1_kw * (p1_kw_year_eur + market_kw_year_eur) + p2_kw * p2_kw_year_eur) / 365 / 24", | ||
| others_formula="iva_tax * meter_month_eur / 30 / 24", | ||
| surplus_formula="electricity_tax * iva_tax * surplus_kwh * surplus_kwh_eur", | ||
| main_formula="energy_term + power_term + others_term", | ||
| ) | ||
| @freeze_time(AT_TIME) | ||
| def test_helper_offline(snapshot) -> None: | ||
| """Tests EdataHelper (syrupy snapshot)""" | ||
| with open(TEST_GOOD_INPUT, "r", encoding="utf-8") as original_file: | ||
| data = utils.deserialize_dict(json.load(original_file)) | ||
| # Sample supply data | ||
| SAMPLE_SUPPLY = Supply( | ||
| cups=TEST_CUPS, | ||
| distributor_code="0031", | ||
| point_type=5, | ||
| date_start=datetime(2020, 1, 1), | ||
| date_end=datetime(2025, 12, 31), | ||
| address="Test Address 123", | ||
| postal_code="28001", | ||
| province="Madrid", | ||
| municipality="Madrid", | ||
| distributor="Test Distributor", | ||
| ) | ||
| # Sample contract data | ||
| SAMPLE_CONTRACT = Contract( | ||
| distributor_code="0031", | ||
| date_start=datetime(2023, 1, 1), | ||
| date_end=datetime(2023, 12, 31), | ||
| power_p1=5.75, | ||
| power_p2=5.75, | ||
| marketer="Test Marketer", | ||
| ) | ||
| # Sample consumption data | ||
| SAMPLE_CONSUMPTIONS = [ | ||
| Consumption( | ||
| datetime=datetime(2023, 10, 14, hour), | ||
| delta_h=1.0, | ||
| value_kwh=0.5 + hour * 0.1, | ||
| surplus_kwh=0.0, | ||
| ) | ||
| for hour in range(24) | ||
| ] | ||
| # Sample maximeter data | ||
| SAMPLE_MAXPOWER = [ | ||
| MaxPower( | ||
| datetime=datetime(2023, 10, day), | ||
| value_kw=4.5 + day * 0.1, | ||
| ) | ||
| for day in range(1, 15) | ||
| ] | ||
| class TestEdataHelperIntegration: | ||
| """Integration tests for EdataHelper with mocked services.""" | ||
| def test_initialization_pvpc(self): | ||
| """Test EdataHelper initialization with PVPC pricing.""" | ||
| helper = EdataHelper( | ||
| datadis_username=TEST_USERNAME, | ||
| datadis_password=TEST_PASSWORD, | ||
| cups=TEST_CUPS, | ||
| datadis_authorized_nif=TEST_NIF, | ||
| "USER", | ||
| "PASS", | ||
| "CUPS", | ||
| datadis_authorized_nif=None, | ||
| pricing_rules=PRICING_RULES_PVPC, | ||
| storage_dir_path=None, | ||
| data=data, | ||
| ) | ||
| helper.process_data() | ||
| # Test basic properties | ||
| assert helper._cups == TEST_CUPS | ||
| assert helper._scups == "1JN0F" | ||
| assert helper._authorized_nif == TEST_NIF | ||
| assert helper.pricing_rules == PRICING_RULES_PVPC | ||
| assert helper.enable_billing is True | ||
| assert helper.is_pvpc is True | ||
| # Test attributes initialization | ||
| assert len(helper.attributes) == len(ATTRIBUTES) | ||
| for attr in ATTRIBUTES: | ||
| assert helper.attributes[attr] is None | ||
| # Test that attributes and summary are the same object | ||
| assert helper.attributes is helper.summary | ||
| # Test services initialization | ||
| assert helper._supply_service is not None | ||
| assert helper._contract_service is not None | ||
| assert helper._consumption_service is not None | ||
| assert helper._maximeter_service is not None | ||
| assert helper._billing_service is not None | ||
| def test_initialization_fixed_pricing(self): | ||
| """Test EdataHelper initialization with fixed pricing.""" | ||
| helper = EdataHelper( | ||
| datadis_username=TEST_USERNAME, | ||
| datadis_password=TEST_PASSWORD, | ||
| cups=TEST_CUPS, | ||
| pricing_rules=PRICING_RULES_FIXED, | ||
| ) | ||
| assert helper.enable_billing is True | ||
| assert helper.is_pvpc is False | ||
| assert helper._billing_service is not None | ||
| def test_initialization_no_billing(self): | ||
| """Test EdataHelper initialization without billing.""" | ||
| helper = EdataHelper( | ||
| datadis_username=TEST_USERNAME, | ||
| datadis_password=TEST_PASSWORD, | ||
| cups=TEST_CUPS, | ||
| pricing_rules=None, | ||
| ) | ||
| assert helper.enable_billing is False | ||
| assert helper.is_pvpc is False | ||
| @freeze_time(AT_TIME) | ||
| @patch("edata.helpers.SupplyService") | ||
| @patch("edata.helpers.ContractService") | ||
| @patch("edata.helpers.ConsumptionService") | ||
| @patch("edata.helpers.MaximeterService") | ||
| @patch("edata.helpers.BillingService") | ||
| @pytest.mark.asyncio | ||
| async def test_update_successful_flow_pvpc( | ||
| self, | ||
| mock_billing_service, | ||
| mock_maximeter_service, | ||
| mock_consumption_service, | ||
| mock_contract_service, | ||
| mock_supply_service, | ||
| ): | ||
| """Test successful update flow with PVPC pricing.""" | ||
| # Setup mocks | ||
| mock_supply_instance = Mock() | ||
| mock_supply_instance.update_supplies = AsyncMock(return_value={"success": True}) | ||
| mock_supply_instance.validate_cups = AsyncMock(return_value=True) | ||
| mock_supply_instance.get_supply_by_cups = AsyncMock(return_value=SAMPLE_SUPPLY) | ||
| mock_supply_instance.get_supply_summary = AsyncMock( | ||
| return_value={"cups": TEST_CUPS} | ||
| ) | ||
| mock_supply_service.return_value = mock_supply_instance | ||
| mock_contract_instance = Mock() | ||
| mock_contract_instance.update_contracts = AsyncMock( | ||
| return_value={"success": True} | ||
| ) | ||
| mock_contract_instance.get_contract_summary = AsyncMock( | ||
| return_value={ | ||
| "contract_p1_kW": 5.75, | ||
| "contract_p2_kW": 5.75, | ||
| } | ||
| ) | ||
| mock_contract_service.return_value = mock_contract_instance | ||
| mock_consumption_instance = Mock() | ||
| mock_consumption_instance.update_consumption_range_by_months = AsyncMock( | ||
| return_value={"success": True} | ||
| ) | ||
| mock_consumption_instance.get_consumption_summary = AsyncMock( | ||
| return_value={ | ||
| "yesterday_kWh": 12.5, | ||
| "month_kWh": 350.0, | ||
| "last_month_kWh": 340.0, | ||
| "last_registered_date": datetime(2023, 10, 14, 23), | ||
| } | ||
| ) | ||
| mock_consumption_service.return_value = mock_consumption_instance | ||
| mock_maximeter_instance = Mock() | ||
| mock_maximeter_instance.update_maxpower_range_by_months = AsyncMock( | ||
| return_value={"success": True} | ||
| ) | ||
| mock_maximeter_instance.get_maximeter_summary = AsyncMock( | ||
| return_value={ | ||
| "max_power_kW": 5.8, | ||
| "max_power_date": datetime(2023, 10, 10), | ||
| "max_power_mean_kW": 4.5, | ||
| "max_power_90perc_kW": 5.2, | ||
| } | ||
| ) | ||
| mock_maximeter_service.return_value = mock_maximeter_instance | ||
| mock_billing_instance = Mock() | ||
| mock_billing_instance.update_pvpc_prices = AsyncMock( | ||
| return_value={"success": True} | ||
| ) | ||
| mock_billing_instance.update_missing_costs = AsyncMock( | ||
| return_value={"success": True} | ||
| ) | ||
| mock_billing_instance.get_billing_summary = AsyncMock( | ||
| return_value={ | ||
| "month_€": 45.67, | ||
| "last_month_€": 43.21, | ||
| } | ||
| ) | ||
| mock_billing_service.return_value = mock_billing_instance | ||
| # Test update | ||
| helper = EdataHelper( | ||
| datadis_username=TEST_USERNAME, | ||
| datadis_password=TEST_PASSWORD, | ||
| cups=TEST_CUPS, | ||
| pricing_rules=PRICING_RULES_PVPC, | ||
| ) | ||
| date_from = datetime(2023, 1, 1) | ||
| date_to = datetime(2023, 10, 15) | ||
| result = await helper.update(date_from=date_from, date_to=date_to) | ||
| # Verify result | ||
| assert result is True | ||
| # Verify service calls | ||
| mock_supply_instance.update_supplies.assert_called_once_with( | ||
| authorized_nif=None | ||
| ) | ||
| mock_supply_instance.validate_cups.assert_called_once_with(TEST_CUPS) | ||
| mock_supply_instance.get_supply_by_cups.assert_called_once_with(TEST_CUPS) | ||
| mock_contract_instance.update_contracts.assert_called_once_with( | ||
| cups=TEST_CUPS, distributor_code="0031", authorized_nif=None | ||
| ) | ||
| mock_consumption_instance.update_consumption_range_by_months.assert_called_once_with( | ||
| cups=TEST_CUPS, | ||
| distributor_code="0031", | ||
| start_date=date_from, # Use the original date_from since it's after supply start | ||
| end_date=date_to, | ||
| measurement_type="0", | ||
| point_type=5, | ||
| authorized_nif=None, | ||
| ) | ||
| mock_maximeter_instance.update_maxpower_range_by_months.assert_called_once() | ||
| mock_billing_instance.update_pvpc_prices.assert_called_once() | ||
| mock_billing_instance.update_missing_costs.assert_called_once() | ||
| # Verify summary attributes | ||
| assert helper.attributes["cups"] == TEST_CUPS | ||
| assert helper.attributes["contract_p1_kW"] == 5.75 | ||
| assert helper.attributes["contract_p2_kW"] == 5.75 | ||
| assert helper.attributes["yesterday_kWh"] == 12.5 | ||
| assert helper.attributes["month_kWh"] == 350.0 | ||
| assert helper.attributes["last_month_kWh"] == 340.0 | ||
| assert helper.attributes["max_power_kW"] == 5.8 | ||
| assert helper.attributes["month_€"] == 45.67 | ||
| assert helper.attributes["last_month_€"] == 43.21 | ||
| @freeze_time(AT_TIME) | ||
| @patch("edata.helpers.SupplyService") | ||
| @patch("edata.helpers.ContractService") | ||
| @patch("edata.helpers.ConsumptionService") | ||
| @patch("edata.helpers.MaximeterService") | ||
| @patch("edata.helpers.BillingService") | ||
| @pytest.mark.asyncio | ||
| async def test_update_with_service_failures( | ||
| self, | ||
| mock_billing_service, | ||
| mock_maximeter_service, | ||
| mock_consumption_service, | ||
| mock_contract_service, | ||
| mock_supply_service, | ||
| ): | ||
| """Test update flow with some service failures.""" | ||
| # Setup mocks with some failures | ||
| mock_supply_instance = Mock() | ||
| mock_supply_instance.update_supplies = AsyncMock(return_value={"success": True}) | ||
| mock_supply_instance.validate_cups = AsyncMock(return_value=True) | ||
| mock_supply_instance.get_supply_by_cups = AsyncMock(return_value=SAMPLE_SUPPLY) | ||
| mock_supply_instance.get_supply_summary = AsyncMock( | ||
| return_value={"cups": TEST_CUPS} | ||
| ) | ||
| mock_supply_service.return_value = mock_supply_instance | ||
| mock_contract_instance = Mock() | ||
| mock_contract_instance.update_contracts = AsyncMock( | ||
| return_value={"success": False, "error": "Contract API down"} | ||
| ) | ||
| mock_contract_instance.get_contract_summary = AsyncMock(return_value={}) | ||
| mock_contract_service.return_value = mock_contract_instance | ||
| mock_consumption_instance = Mock() | ||
| mock_consumption_instance.update_consumption_range_by_months = AsyncMock( | ||
| return_value={"success": False} | ||
| ) | ||
| mock_consumption_instance.get_consumption_summary = AsyncMock(return_value={}) | ||
| mock_consumption_service.return_value = mock_consumption_instance | ||
| mock_maximeter_instance = Mock() | ||
| mock_maximeter_instance.update_maxpower_range_by_months = AsyncMock( | ||
| return_value={"success": True} | ||
| ) | ||
| mock_maximeter_instance.get_maximeter_summary = AsyncMock( | ||
| return_value={"max_power_kW": 5.8} | ||
| ) | ||
| mock_maximeter_service.return_value = mock_maximeter_instance | ||
| mock_billing_instance = Mock() | ||
| mock_billing_instance.update_pvpc_prices = AsyncMock( | ||
| return_value={"success": False, "error": "PVPC API error"} | ||
| ) | ||
| mock_billing_instance.get_billing_summary = AsyncMock(return_value={}) | ||
| mock_billing_service.return_value = mock_billing_instance | ||
| # Test update | ||
| helper = EdataHelper( | ||
| datadis_username=TEST_USERNAME, | ||
| datadis_password=TEST_PASSWORD, | ||
| cups=TEST_CUPS, | ||
| pricing_rules=PRICING_RULES_PVPC, | ||
| ) | ||
| result = await helper.update() | ||
| # Update should still succeed even with some service failures | ||
| assert result is True | ||
| # Verify summary attributes include successful services | ||
| assert helper.attributes["cups"] == TEST_CUPS | ||
| assert helper.attributes["max_power_kW"] == 5.8 | ||
| # Failed services should have None values | ||
| assert helper.attributes["contract_p1_kW"] is None | ||
| assert helper.attributes["yesterday_kWh"] is None | ||
| @patch("edata.helpers.SupplyService") | ||
| @pytest.mark.asyncio | ||
| async def test_update_supply_failure(self, mock_supply_service): | ||
| """Test update with supply service failure.""" | ||
| mock_supply_instance = Mock() | ||
| mock_supply_instance.update_supplies.return_value = { | ||
| "success": False, | ||
| "error": "Authentication failed", | ||
| } | ||
| mock_supply_service.return_value = mock_supply_instance | ||
| helper = EdataHelper( | ||
| datadis_username=TEST_USERNAME, | ||
| datadis_password=TEST_PASSWORD, | ||
| cups=TEST_CUPS, | ||
| ) | ||
| result = await helper.update() | ||
| # Should fail if supplies can't be updated | ||
| assert result is False | ||
| @patch("edata.helpers.SupplyService") | ||
| @pytest.mark.asyncio | ||
| async def test_update_cups_not_found(self, mock_supply_service): | ||
| """Test update when CUPS is not found in account.""" | ||
| mock_supply_instance = Mock() | ||
| mock_supply_instance.update_supplies.return_value = {"success": True} | ||
| mock_supply_instance.validate_cups.return_value = False | ||
| mock_supply_service.return_value = mock_supply_instance | ||
| helper = EdataHelper( | ||
| datadis_username=TEST_USERNAME, | ||
| datadis_password=TEST_PASSWORD, | ||
| cups=TEST_CUPS, | ||
| ) | ||
| result = await helper.update() | ||
| # Should fail if CUPS is not found | ||
| assert result is False | ||
| @patch("edata.helpers.SupplyService") | ||
| @patch("edata.helpers.ContractService") | ||
| @patch("edata.helpers.ConsumptionService") | ||
| @patch("edata.helpers.MaximeterService") | ||
| def test_calculate_summary_attributes_error_handling( | ||
| self, | ||
| mock_maximeter_service, | ||
| mock_consumption_service, | ||
| mock_contract_service, | ||
| mock_supply_service, | ||
| ): | ||
| """Test error handling in summary calculation.""" | ||
| # Setup mock that raises exception | ||
| mock_supply_instance = Mock() | ||
| mock_supply_instance.get_supply_summary.side_effect = Exception( | ||
| "Database error" | ||
| ) | ||
| mock_supply_service.return_value = mock_supply_instance | ||
| mock_contract_instance = Mock() | ||
| mock_contract_instance.get_contract_summary.return_value = { | ||
| "contract_p1_kW": 5.75 | ||
| } | ||
| mock_contract_service.return_value = mock_contract_instance | ||
| mock_consumption_instance = Mock() | ||
| mock_consumption_instance.get_consumption_summary.return_value = { | ||
| "yesterday_kWh": 12.5 | ||
| } | ||
| mock_consumption_service.return_value = mock_consumption_instance | ||
| mock_maximeter_instance = Mock() | ||
| mock_maximeter_instance.get_maximeter_summary.return_value = { | ||
| "max_power_kW": 5.8 | ||
| } | ||
| mock_maximeter_service.return_value = mock_maximeter_instance | ||
| helper = EdataHelper( | ||
| datadis_username=TEST_USERNAME, | ||
| datadis_password=TEST_PASSWORD, | ||
| cups=TEST_CUPS, | ||
| ) | ||
| # Should not raise exception | ||
| # Note: We can't actually test the exception handling easily in async context | ||
| # but we can test that all attributes are None initially | ||
| for attr in ATTRIBUTES: | ||
| assert helper.attributes[attr] is None | ||
| @pytest.mark.asyncio | ||
| async def test_numeric_value_rounding(self): | ||
| """Test that numeric values are properly rounded.""" | ||
| with patch("edata.helpers.SupplyService") as mock_supply_service, patch( | ||
| "edata.helpers.ContractService" | ||
| ) as mock_contract_service, patch( | ||
| "edata.helpers.ConsumptionService" | ||
| ) as mock_consumption_service, patch( | ||
| "edata.helpers.MaximeterService" | ||
| ) as mock_maximeter_service: | ||
| # Setup mocks with unrounded values | ||
| mock_supply_instance = Mock() | ||
| mock_supply_instance.get_supply_summary = AsyncMock( | ||
| return_value={"cups": TEST_CUPS} | ||
| ) | ||
| mock_supply_service.return_value = mock_supply_instance | ||
| mock_contract_instance = Mock() | ||
| mock_contract_instance.get_contract_summary = AsyncMock( | ||
| return_value={"contract_p1_kW": 5.7523456} | ||
| ) | ||
| mock_contract_service.return_value = mock_contract_instance | ||
| mock_consumption_instance = Mock() | ||
| mock_consumption_instance.get_consumption_summary = AsyncMock( | ||
| return_value={"yesterday_kWh": 12.54789} | ||
| ) | ||
| mock_consumption_service.return_value = mock_consumption_instance | ||
| mock_maximeter_instance = Mock() | ||
| mock_maximeter_instance.get_maximeter_summary = AsyncMock( | ||
| return_value={"max_power_kW": 5.87654321} | ||
| ) | ||
| mock_maximeter_service.return_value = mock_maximeter_instance | ||
| helper = EdataHelper( | ||
| datadis_username=TEST_USERNAME, | ||
| datadis_password=TEST_PASSWORD, | ||
| cups=TEST_CUPS, | ||
| ) | ||
| await helper._calculate_summary_attributes() | ||
| # Check rounding | ||
| assert helper.attributes["contract_p1_kW"] == 5.75 | ||
| assert helper.attributes["yesterday_kWh"] == 12.55 | ||
| assert helper.attributes["max_power_kW"] == 5.88 | ||
| assert ( | ||
| helper.attributes["cups"] == TEST_CUPS | ||
| ) # String should not be affected | ||
| @pytest.mark.asyncio | ||
| async def test_date_range_adjustment(self): | ||
| """Test that date ranges are properly adjusted to supply validity period.""" | ||
| with patch("edata.helpers.SupplyService") as mock_supply_service, patch( | ||
| "edata.helpers.ContractService" | ||
| ) as mock_contract_service, patch( | ||
| "edata.helpers.ConsumptionService" | ||
| ) as mock_consumption_service, patch( | ||
| "edata.helpers.MaximeterService" | ||
| ) as mock_maximeter_service: | ||
| # Supply with limited date range | ||
| limited_supply = Supply( | ||
| cups=TEST_CUPS, | ||
| distributor_code="0031", | ||
| point_type=5, | ||
| date_start=datetime(2023, 6, 1), # Later start | ||
| date_end=datetime(2023, 9, 30), # Earlier end | ||
| address="Test Address", | ||
| postal_code="28001", | ||
| province="Madrid", | ||
| municipality="Madrid", | ||
| distributor="Test Distributor", | ||
| ) | ||
| mock_supply_instance = Mock() | ||
| mock_supply_instance.update_supplies = AsyncMock( | ||
| return_value={"success": True} | ||
| ) | ||
| mock_supply_instance.validate_cups = AsyncMock(return_value=True) | ||
| mock_supply_instance.get_supply_by_cups = AsyncMock( | ||
| return_value=limited_supply | ||
| ) | ||
| mock_supply_instance.get_supply_summary = AsyncMock( | ||
| return_value={"cups": TEST_CUPS} | ||
| ) | ||
| mock_supply_service.return_value = mock_supply_instance | ||
| mock_contract_instance = Mock() | ||
| mock_contract_instance.update_contracts = AsyncMock( | ||
| return_value={"success": True} | ||
| ) | ||
| mock_contract_instance.get_contract_summary = AsyncMock(return_value={}) | ||
| mock_contract_service.return_value = mock_contract_instance | ||
| mock_consumption_instance = Mock() | ||
| mock_consumption_instance.update_consumption_range_by_months = AsyncMock( | ||
| return_value={"success": True} | ||
| ) | ||
| mock_consumption_instance.get_consumption_summary = AsyncMock( | ||
| return_value={} | ||
| ) | ||
| mock_consumption_service.return_value = mock_consumption_instance | ||
| mock_maximeter_instance = Mock() | ||
| mock_maximeter_instance.update_maxpower_range_by_months = AsyncMock( | ||
| return_value={"success": True} | ||
| ) | ||
| mock_maximeter_instance.get_maximeter_summary = AsyncMock(return_value={}) | ||
| mock_maximeter_service.return_value = mock_maximeter_instance | ||
| helper = EdataHelper( | ||
| datadis_username=TEST_USERNAME, | ||
| datadis_password=TEST_PASSWORD, | ||
| cups=TEST_CUPS, | ||
| ) | ||
| # Request broader date range | ||
| result = await helper.update( | ||
| date_from=datetime(2023, 1, 1), date_to=datetime(2023, 12, 31) | ||
| ) | ||
| assert result is True | ||
| # Verify that consumption service was called with adjusted dates | ||
| mock_consumption_instance.update_consumption_range_by_months.assert_called_once_with( | ||
| cups=TEST_CUPS, | ||
| distributor_code="0031", | ||
| start_date=datetime(2023, 6, 1), # Adjusted to supply start | ||
| end_date=datetime(2023, 9, 30), # Adjusted to supply end | ||
| measurement_type="0", | ||
| point_type=5, | ||
| authorized_nif=None, | ||
| ) | ||
| # Compare both outputs against the snapshot | ||
| assert { | ||
| "data": utils.serialize_dict(helper.data), | ||
| "attributes": utils.serialize_dict(helper.attributes), | ||
| } == snapshot |
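The rewritten test pins the clock with `freeze_time(AT_TIME)` and compares serialized helper output against a stored snapshot. A dependency-free sketch of that idea, with hypothetical stand-ins for `utils.serialize_dict` and the helper's attribute output (here the frozen instant is passed explicitly instead of patching `datetime.now()`):

```python
import json
from datetime import datetime

AT_TIME = "2022-10-22"  # same pinned date the test passes to freeze_time


def serialize_dict(data: dict) -> str:
    """Deterministic serialization so helper output can be compared
    byte-for-byte against a stored snapshot (hypothetical stand-in
    for processors.utils.serialize_dict)."""
    return json.dumps(data, sort_keys=True, default=str)


def build_attributes(now: datetime) -> dict:
    """Toy stand-in for the attributes EdataHelper produces."""
    return {"last_registered_date": now, "month_kWh": 350.0}


frozen = datetime.fromisoformat(AT_TIME)
snapshot = serialize_dict(build_attributes(frozen))
# A snapshot test records this string on the first run, then asserts
# exact equality on every later run; freezing time keeps any
# date-derived fields stable across runs.
assert snapshot == serialize_dict(build_attributes(frozen))
print(snapshot)
```

Freezing the clock is what makes snapshot comparison viable here: without it, any attribute derived from "now" would differ on every run.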
+7 -16
@@ -1,22 +0,13 @@ | ||
| # Configuration files | ||
| include pyproject.toml | ||
| include Makefile | ||
| include .python-version | ||
| # Documentation | ||
| # Include the README | ||
| include *.md | ||
| # License | ||
| include LICENSE | ||
| # Include the license file | ||
| include LICENSE.txt | ||
| # Include tests (setuptools auto-includes package files) | ||
| recursive-include edata/tests *.py | ||
| # Include setup.py | ||
| include setup.py | ||
| # Exclude compiled files and caches | ||
| global-exclude *.pyc | ||
| global-exclude *.pyo | ||
| global-exclude __pycache__ | ||
| global-exclude .DS_Store | ||
| prune build | ||
| prune dist | ||
| prune *.egg-info | ||
| # Include the data files | ||
| recursive-include data * |
+722 -130
| Metadata-Version: 2.4 | ||
| Name: e-data | ||
| Version: 2.0.0b2 | ||
| Version: 1.3.1 | ||
| Summary: Python library for managing spanish energy data from various web providers | ||
| Author-email: VMG <vmayorg@outlook.es> | ||
| License-Expression: GPL-3.0-or-later | ||
| License: GNU GENERAL PUBLIC LICENSE | ||
| Version 3, 29 June 2007 | ||
| [... verbatim text of the GNU GPL v3 (copyright notice, preamble, and terms) continues here, unchanged from the standard license; it appears to be the embedded license of the 1.3.1 metadata, while the 2.0.0b2 metadata instead declares the SPDX expression License-Expression: GPL-3.0-or-later shown above ...] | ||
| distribution (with or without modification), making available to the | ||
| public, and in some countries other activities as well. | ||
| To "convey" a work means any kind of propagation that enables other | ||
| parties to make or receive copies. Mere interaction with a user through | ||
| a computer network, with no transfer of a copy, is not conveying. | ||
| An interactive user interface displays "Appropriate Legal Notices" | ||
| to the extent that it includes a convenient and prominently visible | ||
| feature that (1) displays an appropriate copyright notice, and (2) | ||
| tells the user that there is no warranty for the work (except to the | ||
| extent that warranties are provided), that licensees may convey the | ||
| work under this License, and how to view a copy of this License. If | ||
| the interface presents a list of user commands or options, such as a | ||
| menu, a prominent item in the list meets this criterion. | ||
| 1. Source Code. | ||
| The "source code" for a work means the preferred form of the work | ||
| for making modifications to it. "Object code" means any non-source | ||
| form of a work. | ||
| A "Standard Interface" means an interface that either is an official | ||
| standard defined by a recognized standards body, or, in the case of | ||
| interfaces specified for a particular programming language, one that | ||
| is widely used among developers working in that language. | ||
| The "System Libraries" of an executable work include anything, other | ||
| than the work as a whole, that (a) is included in the normal form of | ||
| packaging a Major Component, but which is not part of that Major | ||
| Component, and (b) serves only to enable use of the work with that | ||
| Major Component, or to implement a Standard Interface for which an | ||
| implementation is available to the public in source code form. A | ||
| "Major Component", in this context, means a major essential component | ||
| (kernel, window system, and so on) of the specific operating system | ||
| (if any) on which the executable work runs, or a compiler used to | ||
| produce the work, or an object code interpreter used to run it. | ||
| The "Corresponding Source" for a work in object code form means all | ||
| the source code needed to generate, install, and (for an executable | ||
| work) run the object code and to modify the work, including scripts to | ||
| control those activities. However, it does not include the work's | ||
| System Libraries, or general-purpose tools or generally available free | ||
| programs which are used unmodified in performing those activities but | ||
| which are not part of the work. For example, Corresponding Source | ||
| includes interface definition files associated with source files for | ||
| the work, and the source code for shared libraries and dynamically | ||
| linked subprograms that the work is specifically designed to require, | ||
| such as by intimate data communication or control flow between those | ||
| subprograms and other parts of the work. | ||
| The Corresponding Source need not include anything that users | ||
| can regenerate automatically from other parts of the Corresponding | ||
| Source. | ||
| The Corresponding Source for a work in source code form is that | ||
| same work. | ||
| 2. Basic Permissions. | ||
| All rights granted under this License are granted for the term of | ||
| copyright on the Program, and are irrevocable provided the stated | ||
| conditions are met. This License explicitly affirms your unlimited | ||
| permission to run the unmodified Program. The output from running a | ||
| covered work is covered by this License only if the output, given its | ||
| content, constitutes a covered work. This License acknowledges your | ||
| rights of fair use or other equivalent, as provided by copyright law. | ||
| You may make, run and propagate covered works that you do not | ||
| convey, without conditions so long as your license otherwise remains | ||
| in force. You may convey covered works to others for the sole purpose | ||
| of having them make modifications exclusively for you, or provide you | ||
| with facilities for running those works, provided that you comply with | ||
| the terms of this License in conveying all material for which you do | ||
| not control copyright. Those thus making or running the covered works | ||
| for you must do so exclusively on your behalf, under your direction | ||
| and control, on terms that prohibit them from making any copies of | ||
| your copyrighted material outside their relationship with you. | ||
| Conveying under any other circumstances is permitted solely under | ||
| the conditions stated below. Sublicensing is not allowed; section 10 | ||
| makes it unnecessary. | ||
| 3. Protecting Users' Legal Rights From Anti-Circumvention Law. | ||
| No covered work shall be deemed part of an effective technological | ||
| measure under any applicable law fulfilling obligations under article | ||
| 11 of the WIPO copyright treaty adopted on 20 December 1996, or | ||
| similar laws prohibiting or restricting circumvention of such | ||
| measures. | ||
| When you convey a covered work, you waive any legal power to forbid | ||
| circumvention of technological measures to the extent such circumvention | ||
| is effected by exercising rights under this License with respect to | ||
| the covered work, and you disclaim any intention to limit operation or | ||
| modification of the work as a means of enforcing, against the work's | ||
| users, your or third parties' legal rights to forbid circumvention of | ||
| technological measures. | ||
| 4. Conveying Verbatim Copies. | ||
| You may convey verbatim copies of the Program's source code as you | ||
| receive it, in any medium, provided that you conspicuously and | ||
| appropriately publish on each copy an appropriate copyright notice; | ||
| keep intact all notices stating that this License and any | ||
| non-permissive terms added in accord with section 7 apply to the code; | ||
| keep intact all notices of the absence of any warranty; and give all | ||
| recipients a copy of this License along with the Program. | ||
| You may charge any price or no price for each copy that you convey, | ||
| and you may offer support or warranty protection for a fee. | ||
| 5. Conveying Modified Source Versions. | ||
| You may convey a work based on the Program, or the modifications to | ||
| produce it from the Program, in the form of source code under the | ||
| terms of section 4, provided that you also meet all of these conditions: | ||
| a) The work must carry prominent notices stating that you modified | ||
| it, and giving a relevant date. | ||
| b) The work must carry prominent notices stating that it is | ||
| released under this License and any conditions added under section | ||
| 7. This requirement modifies the requirement in section 4 to | ||
| "keep intact all notices". | ||
| c) You must license the entire work, as a whole, under this | ||
| License to anyone who comes into possession of a copy. This | ||
| License will therefore apply, along with any applicable section 7 | ||
| additional terms, to the whole of the work, and all its parts, | ||
| regardless of how they are packaged. This License gives no | ||
| permission to license the work in any other way, but it does not | ||
| invalidate such permission if you have separately received it. | ||
| d) If the work has interactive user interfaces, each must display | ||
| Appropriate Legal Notices; however, if the Program has interactive | ||
| interfaces that do not display Appropriate Legal Notices, your | ||
| work need not make them do so. | ||
| A compilation of a covered work with other separate and independent | ||
| works, which are not by their nature extensions of the covered work, | ||
| and which are not combined with it such as to form a larger program, | ||
| in or on a volume of a storage or distribution medium, is called an | ||
| "aggregate" if the compilation and its resulting copyright are not | ||
| used to limit the access or legal rights of the compilation's users | ||
| beyond what the individual works permit. Inclusion of a covered work | ||
| in an aggregate does not cause this License to apply to the other | ||
| parts of the aggregate. | ||
| 6. Conveying Non-Source Forms. | ||
| You may convey a covered work in object code form under the terms | ||
| of sections 4 and 5, provided that you also convey the | ||
| machine-readable Corresponding Source under the terms of this License, | ||
| in one of these ways: | ||
| a) Convey the object code in, or embodied in, a physical product | ||
| (including a physical distribution medium), accompanied by the | ||
| Corresponding Source fixed on a durable physical medium | ||
| customarily used for software interchange. | ||
| b) Convey the object code in, or embodied in, a physical product | ||
| (including a physical distribution medium), accompanied by a | ||
| written offer, valid for at least three years and valid for as | ||
| long as you offer spare parts or customer support for that product | ||
| model, to give anyone who possesses the object code either (1) a | ||
| copy of the Corresponding Source for all the software in the | ||
| product that is covered by this License, on a durable physical | ||
| medium customarily used for software interchange, for a price no | ||
| more than your reasonable cost of physically performing this | ||
| conveying of source, or (2) access to copy the | ||
| Corresponding Source from a network server at no charge. | ||
| c) Convey individual copies of the object code with a copy of the | ||
| written offer to provide the Corresponding Source. This | ||
| alternative is allowed only occasionally and noncommercially, and | ||
| only if you received the object code with such an offer, in accord | ||
| with subsection 6b. | ||
| d) Convey the object code by offering access from a designated | ||
| place (gratis or for a charge), and offer equivalent access to the | ||
| Corresponding Source in the same way through the same place at no | ||
| further charge. You need not require recipients to copy the | ||
| Corresponding Source along with the object code. If the place to | ||
| copy the object code is a network server, the Corresponding Source | ||
| may be on a different server (operated by you or a third party) | ||
| that supports equivalent copying facilities, provided you maintain | ||
| clear directions next to the object code saying where to find the | ||
| Corresponding Source. Regardless of what server hosts the | ||
| Corresponding Source, you remain obligated to ensure that it is | ||
| available for as long as needed to satisfy these requirements. | ||
| e) Convey the object code using peer-to-peer transmission, provided | ||
| you inform other peers where the object code and Corresponding | ||
| Source of the work are being offered to the general public at no | ||
| charge under subsection 6d. | ||
| A separable portion of the object code, whose source code is excluded | ||
| from the Corresponding Source as a System Library, need not be | ||
| included in conveying the object code work. | ||
| A "User Product" is either (1) a "consumer product", which means any | ||
| tangible personal property which is normally used for personal, family, | ||
| or household purposes, or (2) anything designed or sold for incorporation | ||
| into a dwelling. In determining whether a product is a consumer product, | ||
| doubtful cases shall be resolved in favor of coverage. For a particular | ||
| product received by a particular user, "normally used" refers to a | ||
| typical or common use of that class of product, regardless of the status | ||
| of the particular user or of the way in which the particular user | ||
| actually uses, or expects or is expected to use, the product. A product | ||
| is a consumer product regardless of whether the product has substantial | ||
| commercial, industrial or non-consumer uses, unless such uses represent | ||
| the only significant mode of use of the product. | ||
| "Installation Information" for a User Product means any methods, | ||
| procedures, authorization keys, or other information required to install | ||
| and execute modified versions of a covered work in that User Product from | ||
| a modified version of its Corresponding Source. The information must | ||
| suffice to ensure that the continued functioning of the modified object | ||
| code is in no case prevented or interfered with solely because | ||
| modification has been made. | ||
| If you convey an object code work under this section in, or with, or | ||
| specifically for use in, a User Product, and the conveying occurs as | ||
| part of a transaction in which the right of possession and use of the | ||
| User Product is transferred to the recipient in perpetuity or for a | ||
| fixed term (regardless of how the transaction is characterized), the | ||
| Corresponding Source conveyed under this section must be accompanied | ||
| by the Installation Information. But this requirement does not apply | ||
| if neither you nor any third party retains the ability to install | ||
| modified object code on the User Product (for example, the work has | ||
| been installed in ROM). | ||
| The requirement to provide Installation Information does not include a | ||
| requirement to continue to provide support service, warranty, or updates | ||
| for a work that has been modified or installed by the recipient, or for | ||
| the User Product in which it has been modified or installed. Access to a | ||
| network may be denied when the modification itself materially and | ||
| adversely affects the operation of the network or violates the rules and | ||
| protocols for communication across the network. | ||
| Corresponding Source conveyed, and Installation Information provided, | ||
| in accord with this section must be in a format that is publicly | ||
| documented (and with an implementation available to the public in | ||
| source code form), and must require no special password or key for | ||
| unpacking, reading or copying. | ||
| 7. Additional Terms. | ||
| "Additional permissions" are terms that supplement the terms of this | ||
| License by making exceptions from one or more of its conditions. | ||
| Additional permissions that are applicable to the entire Program shall | ||
| be treated as though they were included in this License, to the extent | ||
| that they are valid under applicable law. If additional permissions | ||
| apply only to part of the Program, that part may be used separately | ||
| under those permissions, but the entire Program remains governed by | ||
| this License without regard to the additional permissions. | ||
| When you convey a copy of a covered work, you may at your option | ||
| remove any additional permissions from that copy, or from any part of | ||
| it. (Additional permissions may be written to require their own | ||
| removal in certain cases when you modify the work.) You may place | ||
| additional permissions on material, added by you to a covered work, | ||
| for which you have or can give appropriate copyright permission. | ||
| Notwithstanding any other provision of this License, for material you | ||
| add to a covered work, you may (if authorized by the copyright holders of | ||
| that material) supplement the terms of this License with terms: | ||
| a) Disclaiming warranty or limiting liability differently from the | ||
| terms of sections 15 and 16 of this License; or | ||
| b) Requiring preservation of specified reasonable legal notices or | ||
| author attributions in that material or in the Appropriate Legal | ||
| Notices displayed by works containing it; or | ||
| c) Prohibiting misrepresentation of the origin of that material, or | ||
| requiring that modified versions of such material be marked in | ||
| reasonable ways as different from the original version; or | ||
| d) Limiting the use for publicity purposes of names of licensors or | ||
| authors of the material; or | ||
| e) Declining to grant rights under trademark law for use of some | ||
| trade names, trademarks, or service marks; or | ||
| f) Requiring indemnification of licensors and authors of that | ||
| material by anyone who conveys the material (or modified versions of | ||
| it) with contractual assumptions of liability to the recipient, for | ||
| any liability that these contractual assumptions directly impose on | ||
| those licensors and authors. | ||
| All other non-permissive additional terms are considered "further | ||
| restrictions" within the meaning of section 10. If the Program as you | ||
| received it, or any part of it, contains a notice stating that it is | ||
| governed by this License along with a term that is a further | ||
| restriction, you may remove that term. If a license document contains | ||
| a further restriction but permits relicensing or conveying under this | ||
| License, you may add to a covered work material governed by the terms | ||
| of that license document, provided that the further restriction does | ||
| not survive such relicensing or conveying. | ||
| If you add terms to a covered work in accord with this section, you | ||
| must place, in the relevant source files, a statement of the | ||
| additional terms that apply to those files, or a notice indicating | ||
| where to find the applicable terms. | ||
| Additional terms, permissive or non-permissive, may be stated in the | ||
| form of a separately written license, or stated as exceptions; | ||
| the above requirements apply either way. | ||
| 8. Termination. | ||
| You may not propagate or modify a covered work except as expressly | ||
| provided under this License. Any attempt otherwise to propagate or | ||
| modify it is void, and will automatically terminate your rights under | ||
| this License (including any patent licenses granted under the third | ||
| paragraph of section 11). | ||
| However, if you cease all violation of this License, then your | ||
| license from a particular copyright holder is reinstated (a) | ||
| provisionally, unless and until the copyright holder explicitly and | ||
| finally terminates your license, and (b) permanently, if the copyright | ||
| holder fails to notify you of the violation by some reasonable means | ||
| prior to 60 days after the cessation. | ||
| Moreover, your license from a particular copyright holder is | ||
| reinstated permanently if the copyright holder notifies you of the | ||
| violation by some reasonable means, this is the first time you have | ||
| received notice of violation of this License (for any work) from that | ||
| copyright holder, and you cure the violation prior to 30 days after | ||
| your receipt of the notice. | ||
| Termination of your rights under this section does not terminate the | ||
| licenses of parties who have received copies or rights from you under | ||
| this License. If your rights have been terminated and not permanently | ||
| reinstated, you do not qualify to receive new licenses for the same | ||
| material under section 10. | ||
| 9. Acceptance Not Required for Having Copies. | ||
| You are not required to accept this License in order to receive or | ||
| run a copy of the Program. Ancillary propagation of a covered work | ||
| occurring solely as a consequence of using peer-to-peer transmission | ||
| to receive a copy likewise does not require acceptance. However, | ||
| nothing other than this License grants you permission to propagate or | ||
| modify any covered work. These actions infringe copyright if you do | ||
| not accept this License. Therefore, by modifying or propagating a | ||
| covered work, you indicate your acceptance of this License to do so. | ||
| 10. Automatic Licensing of Downstream Recipients. | ||
| Each time you convey a covered work, the recipient automatically | ||
| receives a license from the original licensors, to run, modify and | ||
| propagate that work, subject to this License. You are not responsible | ||
| for enforcing compliance by third parties with this License. | ||
| An "entity transaction" is a transaction transferring control of an | ||
| organization, or substantially all assets of one, or subdividing an | ||
| organization, or merging organizations. If propagation of a covered | ||
| work results from an entity transaction, each party to that | ||
| transaction who receives a copy of the work also receives whatever | ||
| licenses to the work the party's predecessor in interest had or could | ||
| give under the previous paragraph, plus a right to possession of the | ||
| Corresponding Source of the work from the predecessor in interest, if | ||
| the predecessor has it or can get it with reasonable efforts. | ||
| You may not impose any further restrictions on the exercise of the | ||
| rights granted or affirmed under this License. For example, you may | ||
| not impose a license fee, royalty, or other charge for exercise of | ||
| rights granted under this License, and you may not initiate litigation | ||
| (including a cross-claim or counterclaim in a lawsuit) alleging that | ||
| any patent claim is infringed by making, using, selling, offering for | ||
| sale, or importing the Program or any portion of it. | ||
| 11. Patents. | ||
| A "contributor" is a copyright holder who authorizes use under this | ||
| License of the Program or a work on which the Program is based. The | ||
| work thus licensed is called the contributor's "contributor version". | ||
| A contributor's "essential patent claims" are all patent claims | ||
| owned or controlled by the contributor, whether already acquired or | ||
| hereafter acquired, that would be infringed by some manner, permitted | ||
| by this License, of making, using, or selling its contributor version, | ||
| but do not include claims that would be infringed only as a | ||
| consequence of further modification of the contributor version. For | ||
| purposes of this definition, "control" includes the right to grant | ||
| patent sublicenses in a manner consistent with the requirements of | ||
| this License. | ||
| Each contributor grants you a non-exclusive, worldwide, royalty-free | ||
| patent license under the contributor's essential patent claims, to | ||
| make, use, sell, offer for sale, import and otherwise run, modify and | ||
| propagate the contents of its contributor version. | ||
| In the following three paragraphs, a "patent license" is any express | ||
| agreement or commitment, however denominated, not to enforce a patent | ||
| (such as an express permission to practice a patent or covenant not to | ||
| sue for patent infringement). To "grant" such a patent license to a | ||
| party means to make such an agreement or commitment not to enforce a | ||
| patent against the party. | ||
| If you convey a covered work, knowingly relying on a patent license, | ||
| and the Corresponding Source of the work is not available for anyone | ||
| to copy, free of charge and under the terms of this License, through a | ||
| publicly available network server or other readily accessible means, | ||
| then you must either (1) cause the Corresponding Source to be so | ||
| available, or (2) arrange to deprive yourself of the benefit of the | ||
| patent license for this particular work, or (3) arrange, in a manner | ||
| consistent with the requirements of this License, to extend the patent | ||
| license to downstream recipients. "Knowingly relying" means you have | ||
| actual knowledge that, but for the patent license, your conveying the | ||
| covered work in a country, or your recipient's use of the covered work | ||
| in a country, would infringe one or more identifiable patents in that | ||
| country that you have reason to believe are valid. | ||
| If, pursuant to or in connection with a single transaction or | ||
| arrangement, you convey, or propagate by procuring conveyance of, a | ||
| covered work, and grant a patent license to some of the parties | ||
| receiving the covered work authorizing them to use, propagate, modify | ||
| or convey a specific copy of the covered work, then the patent license | ||
| you grant is automatically extended to all recipients of the covered | ||
| work and works based on it. | ||
| A patent license is "discriminatory" if it does not include within | ||
| the scope of its coverage, prohibits the exercise of, or is | ||
| conditioned on the non-exercise of one or more of the rights that are | ||
| specifically granted under this License. You may not convey a covered | ||
| work if you are a party to an arrangement with a third party that is | ||
| in the business of distributing software, under which you make payment | ||
| to the third party based on the extent of your activity of conveying | ||
| the work, and under which the third party grants, to any of the | ||
| parties who would receive the covered work from you, a discriminatory | ||
| patent license (a) in connection with copies of the covered work | ||
| conveyed by you (or copies made from those copies), or (b) primarily | ||
| for and in connection with specific products or compilations that | ||
| contain the covered work, unless you entered into that arrangement, | ||
| or that patent license was granted, prior to 28 March 2007. | ||
| Nothing in this License shall be construed as excluding or limiting | ||
| any implied license or other defenses to infringement that may | ||
| otherwise be available to you under applicable patent law. | ||
| 12. No Surrender of Others' Freedom. | ||
| If conditions are imposed on you (whether by court order, agreement or | ||
| otherwise) that contradict the conditions of this License, they do not | ||
| excuse you from the conditions of this License. If you cannot convey a | ||
| covered work so as to satisfy simultaneously your obligations under this | ||
| License and any other pertinent obligations, then as a consequence you may | ||
| not convey it at all. For example, if you agree to terms that obligate you | ||
| to collect a royalty for further conveying from those to whom you convey | ||
| the Program, the only way you could satisfy both those terms and this | ||
| License would be to refrain entirely from conveying the Program. | ||
| 13. Use with the GNU Affero General Public License. | ||
| Notwithstanding any other provision of this License, you have | ||
| permission to link or combine any covered work with a work licensed | ||
| under version 3 of the GNU Affero General Public License into a single | ||
| combined work, and to convey the resulting work. The terms of this | ||
| License will continue to apply to the part which is the covered work, | ||
| but the special requirements of the GNU Affero General Public License, | ||
| section 13, concerning interaction through a network will apply to the | ||
| combination as such. | ||
| 14. Revised Versions of this License. | ||
| The Free Software Foundation may publish revised and/or new versions of | ||
| the GNU General Public License from time to time. Such new versions will | ||
| be similar in spirit to the present version, but may differ in detail to | ||
| address new problems or concerns. | ||
| Each version is given a distinguishing version number. If the | ||
| Program specifies that a certain numbered version of the GNU General | ||
| Public License "or any later version" applies to it, you have the | ||
| option of following the terms and conditions either of that numbered | ||
| version or of any later version published by the Free Software | ||
| Foundation. If the Program does not specify a version number of the | ||
| GNU General Public License, you may choose any version ever published | ||
| by the Free Software Foundation. | ||
| If the Program specifies that a proxy can decide which future | ||
| versions of the GNU General Public License can be used, that proxy's | ||
| public statement of acceptance of a version permanently authorizes you | ||
| to choose that version for the Program. | ||
| Later license versions may give you additional or different | ||
| permissions. However, no additional obligations are imposed on any | ||
| author or copyright holder as a result of your choosing to follow a | ||
| later version. | ||
| 15. Disclaimer of Warranty. | ||
| THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY | ||
| APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT | ||
| HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY | ||
| OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, | ||
| THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR | ||
| PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM | ||
| IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF | ||
| ALL NECESSARY SERVICING, REPAIR OR CORRECTION. | ||
| 16. Limitation of Liability. | ||
| IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING | ||
| WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS | ||
| THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY | ||
| GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE | ||
| USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF | ||
| DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD | ||
| PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), | ||
| EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF | ||
| SUCH DAMAGES. | ||
| 17. Interpretation of Sections 15 and 16. | ||
| If the disclaimer of warranty and limitation of liability provided | ||
| above cannot be given local legal effect according to their terms, | ||
| reviewing courts shall apply local law that most closely approximates | ||
| an absolute waiver of all civil liability in connection with the | ||
| Program, unless a warranty or assumption of liability accompanies a | ||
| copy of the Program in return for a fee. | ||
| END OF TERMS AND CONDITIONS | ||
| How to Apply These Terms to Your New Programs | ||
| If you develop a new program, and you want it to be of the greatest | ||
| possible use to the public, the best way to achieve this is to make it | ||
| free software which everyone can redistribute and change under these terms. | ||
| To do so, attach the following notices to the program. It is safest | ||
| to attach them to the start of each source file to most effectively | ||
| state the exclusion of warranty; and each file should have at least | ||
| the "copyright" line and a pointer to where the full notice is found. | ||
| <one line to give the program's name and a brief idea of what it does.> | ||
| Copyright (C) <year> <name of author> | ||
| This program is free software: you can redistribute it and/or modify | ||
| it under the terms of the GNU General Public License as published by | ||
| the Free Software Foundation, either version 3 of the License, or | ||
| (at your option) any later version. | ||
| This program is distributed in the hope that it will be useful, | ||
| but WITHOUT ANY WARRANTY; without even the implied warranty of | ||
| MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | ||
| GNU General Public License for more details. | ||
| You should have received a copy of the GNU General Public License | ||
| along with this program. If not, see <https://www.gnu.org/licenses/>. | ||
| Also add information on how to contact you by electronic and paper mail. | ||
| If the program does terminal interaction, make it output a short | ||
| notice like this when it starts in an interactive mode: | ||
| <program> Copyright (C) <year> <name of author> | ||
| This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'. | ||
| This is free software, and you are welcome to redistribute it | ||
| under certain conditions; type `show c' for details. | ||
| The hypothetical commands `show w' and `show c' should show the appropriate | ||
| parts of the General Public License. Of course, your program's commands | ||
| might be different; for a GUI interface, you would use an "about box". | ||
| You should also get your employer (if you work as a programmer) or school, | ||
| if any, to sign a "copyright disclaimer" for the program, if necessary. | ||
| For more information on this, and how to apply and follow the GNU GPL, see | ||
| <https://www.gnu.org/licenses/>. | ||
| The GNU General Public License does not permit incorporating your program | ||
| into proprietary programs. If your program is a subroutine library, you | ||
| may consider it more useful to permit linking proprietary applications with | ||
| the library. If this is what you want to do, use the GNU Lesser General | ||
| Public License instead of this License. But first, please read | ||
| <https://www.gnu.org/licenses/why-not-lgpl.html>. | ||
| Project-URL: Homepage, https://github.com/uvejota/python-edata | ||
| Project-URL: Repository, https://github.com/uvejota/python-edata | ||
| Project-URL: Issues, https://github.com/uvejota/python-edata/issues | ||
| Keywords: energy,data,spain,electricity,consumption | ||
| Classifier: Intended Audience :: Developers | ||
| Classifier: Programming Language :: Python :: 3 | ||
| Classifier: Programming Language :: Python :: 3.8 | ||
| Classifier: Programming Language :: Python :: 3.9 | ||
| Classifier: Programming Language :: Python :: 3.10 | ||
| Classifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3) | ||
| Classifier: Programming Language :: Python | ||
| Classifier: Programming Language :: Python :: 3.11 | ||
| Classifier: Programming Language :: Python :: 3.12 | ||
| Classifier: Programming Language :: Python :: Implementation :: CPython | ||
| Classifier: Topic :: Software Development :: Libraries :: Python Modules | ||
| Classifier: Topic :: Utilities | ||
| Requires-Python: >=3.8 | ||
| Classifier: Programming Language :: Python :: Implementation :: PyPy | ||
| Requires-Python: >=3.11.0 | ||
| Description-Content-Type: text/markdown | ||
| License-File: LICENSE | ||
| Requires-Dist: dateparser>=1.1.2 | ||
| Requires-Dist: freezegun>=1.2.1 | ||
| Requires-Dist: holidays>=0.14.2 | ||
| Requires-Dist: python-dateutil>=2.8.2 | ||
| Requires-Dist: pytest>=7.1.2 | ||
| Requires-Dist: requests>=2.28.1 | ||
| Requires-Dist: voluptuous>=0.13.1 | ||
| Requires-Dist: Jinja2>=3.1.2 | ||
| Requires-Dist: pydantic>=2.0.0 | ||
| Requires-Dist: sqlmodel>=0.0.24 | ||
| Requires-Dist: aiosqlite>=0.20.0 | ||
| Requires-Dist: sqlalchemy[asyncio]>=2.0.0 | ||
| Requires-Dist: aiohttp>=3.8.0 | ||
| Requires-Dist: diskcache>=5.4.0 | ||
| Provides-Extra: dev | ||
| Requires-Dist: pytest>=7.1.2; extra == "dev" | ||
| Requires-Dist: freezegun>=1.2.1; extra == "dev" | ||
| Requires-Dist: twine>=4.0.0; extra == "dev" | ||
| Requires-Dist: build>=0.10.0; extra == "dev" | ||
| Requires-Dist: black>=22.0.0; extra == "dev" | ||
| Requires-Dist: flake8>=5.0.0; extra == "dev" | ||
| Requires-Dist: mypy>=1.0.0; extra == "dev" | ||
| Provides-Extra: test | ||
| Requires-Dist: pytest>=7.1.2; extra == "test" | ||
| Requires-Dist: freezegun>=1.2.1; extra == "test" | ||
| Requires-Dist: pytest-cov>=4.0.0; extra == "test" | ||
| Requires-Dist: pytest-asyncio>=0.20.0; extra == "test" | ||
| Requires-Dist: diskcache>=5.6.3 | ||
| Requires-Dist: aiohttp>=3.12.15 | ||
| Dynamic: license-file | ||
``` bash
make install-dev
```
The package uses a **service-based architecture** with the following modules:
* **Connectors** (`connectors` module), which define the query methods for the different providers: Datadis and REData.
* **Models** (`models` module), which define the data structures using Pydantic v2 for robust validation. It includes models for supplies, contracts, consumption, maximeter readings, prices, and the database.
* **Services** (`services` module), which implement the business logic for each domain: management of supplies, contracts, consumption, maximeter readings, billing, and the SQLite database.
* **Main helper** (`helpers.py`), which orchestrates all the services and provides a simplified interface. `EdataHelper` lets you download and process data automatically, computing more than 40 summary attributes.
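As a rough sketch of the service-based layout described above, each service receives a shared connector instance and builds its domain logic on top of it. The class and method names below are illustrative stand-ins, not the library's actual API:

```python
from dataclasses import dataclass

# Hypothetical stand-ins for the real connector/service classes,
# shown only to illustrate the dependency-injection layout.
@dataclass
class FakeConnector:
    username: str

    def fetch(self, resource: str) -> dict:
        # A real connector would call the provider's HTTP API here.
        return {"resource": resource, "user": self.username}

class SupplyLikeService:
    """Toy service: owns domain logic, delegates I/O to the connector."""

    def __init__(self, connector: FakeConnector) -> None:
        self._connector = connector

    def list_supplies(self) -> dict:
        return self._connector.fetch("supplies")

connector = FakeConnector(username="datadis_user")
service = SupplyLikeService(connector)
result = service.list_supplies()
```

The same connector instance can be shared by every service, which is what lets the helper reuse one authenticated session across domains.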
These modules map to the following package structure:
```
edata/
 · __init__.py
 · const.py           # Constants and attribute definitions
 · utils.py           # General utilities
 · helpers.py         # Main helper (EdataHelper)
 · connectors/
     · __init__.py
     · datadis.py     # Datadis API connector
     · redata.py      # REData API connector (PVPC)
 · models/
     · __init__.py
     · base.py        # Base models (Pydantic)
     · supply.py      # Supply model
     · contract.py    # Contract model
     · consumption.py # Consumption model
     · maximeter.py   # Maximeter model
     · pricing.py     # Pricing-rules model
     · database.py    # SQLite models
 · services/
     · __init__.py
     · database.py    # SQLite database service
     · supply.py      # Supply management
     · contract.py    # Contract management
     · consumption.py # Consumption management
     · maximeter.py   # Maximeter management
     · billing.py     # Billing management
 · scripts/
     · __init__.py
     · dump.py        # Interactive download script
```
## Interactive script
The package includes an interactive script that makes the initial data download easy:
```bash
# Run the interactive script
python -m edata.scripts.dump
# With a custom storage directory
python -m edata.scripts.dump --storage-dir /custom/path
```
This script walks you step by step through:
1. Configuring your Datadis credentials
2. Selecting the supply to process
3. Defining the date range
4. Downloading and storing all the data
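The same four steps could, in principle, be driven non-interactively. A minimal argument-parsing sketch follows; only `--storage-dir` is documented above, so the credential and CUPS flags are assumptions for illustration, not the script's real interface:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Mirrors the documented --storage-dir flag; --username and --cups
    # are hypothetical flags, shown only for illustration.
    parser = argparse.ArgumentParser(prog="edata-dump")
    parser.add_argument("--storage-dir", default="edata.storage")
    parser.add_argument("--username")
    parser.add_argument("--cups")
    return parser

args = build_parser().parse_args(
    ["--storage-dir", "/tmp/edata", "--username", "datadis_user", "--cups", "ESXXXX"]
)
```

Defaulting `--storage-dir` to `edata.storage` matches the package's documented default storage location.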
## Usage example

``` python
import asyncio
from datetime import datetime
# import the pricing-rules model
from edata.models.pricing import PricingRules
# import the main helper
from edata.helpers import EdataHelper

# ... prepare the pricing rules here if desired, e.g. a PricingRules
# instance named PRICING_RULES_PVPC (omitted in this excerpt) ...

async def main():
    # Instantiate the helper.
    # 'datadis_authorized_nif' lets you indicate the NIF of the person who
    # authorizes us to query their CUPS.
    # 'storage_dir_path' lets you choose where the local database is stored.
    edata = EdataHelper(
        "datadis_user",
        "datadis_password",
        "cups",
        datadis_authorized_nif=None,
        pricing_rules=PRICING_RULES_PVPC,  # pass None to disable billing
        storage_dir_path=None,  # defaults to ./edata.storage/
    )

    # Request an update of the whole history (data is stored in SQLite)
    success = await edata.update(date_from=datetime(1970, 1, 1), date_to=datetime.today())

    if success:
        # Print the computed summary attributes
        print("Computed attributes:")
        for key, value in edata.attributes.items():
            if value is not None:
                print(f"  {key}: {value}")
        # Data is automatically stored in the local SQLite database,
        # located at edata.storage/edata.db by default
        print("\nData stored in the local database")
    else:
        print("Error while updating data")

# Run the example
if __name__ == "__main__":
    asyncio.run(main())
```
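If you also want a plain-text backup of the computed attributes, one stdlib-only approach (independent of any edata serialization helpers, which are not shown here) is to dump them with a datetime-safe fallback:

```python
import json
from datetime import datetime

# Example dict shaped like edata.attributes (the values are made up).
attributes = {
    "last_registered_date": datetime(2024, 5, 1),
    "month_kWh": 123.4,
    "cups": None,
}

# default=str converts datetimes (and any other non-JSON types) to strings,
# so the dump never fails on date-valued attributes.
backup = json.dumps(attributes, default=str, ensure_ascii=False)
```

Reading the backup back is a plain `json.loads`; date strings would need re-parsing if you want `datetime` objects again.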
## Contributing
This project is under active development. Contributions are welcome:
1. Fork the repository
2. Create a branch for your feature: `git checkout -b feature/new-feature`
3. Commit your changes: `git commit -am 'Add new feature'`
4. Push the branch: `git push origin feature/new-feature`
5. Open a Pull Request
## License
This project is licensed under GPLv3. See the [LICENSE](LICENSE) file for details.
@@ -7,99 +7,38 @@ [build-system] | ||
| name = "e-data" | ||
| version = "2.0.0b2" | ||
| version = "1.3.1" | ||
| description = "Python library for managing spanish energy data from various web providers" | ||
| readme = "README.md" | ||
| authors = [ | ||
{ name = "VMG", email = "vmayorg@outlook.es" }
| ] | ||
| license = "GPL-3.0-or-later" | ||
| requires-python = ">=3.8" | ||
| license = { file = "LICENSE" } | ||
| requires-python = ">=3.11.0" | ||
| classifiers = [ | ||
| "Intended Audience :: Developers", | ||
| "Programming Language :: Python :: 3", | ||
| "Programming Language :: Python :: 3.8", | ||
| "Programming Language :: Python :: 3.9", | ||
| "Programming Language :: Python :: 3.10", | ||
| "License :: OSI Approved :: GNU General Public License v3 (GPLv3)", | ||
| "Programming Language :: Python", | ||
| "Programming Language :: Python :: 3.11", | ||
| "Programming Language :: Python :: 3.12", | ||
| "Programming Language :: Python :: Implementation :: CPython", | ||
| "Topic :: Software Development :: Libraries :: Python Modules", | ||
| "Topic :: Utilities", | ||
| "Programming Language :: Python :: Implementation :: PyPy", | ||
| ] | ||
| keywords = ["energy", "data", "spain", "electricity", "consumption"] | ||
| dependencies = [ | ||
| "dateparser>=1.1.2", | ||
| "freezegun>=1.2.1", | ||
| "holidays>=0.14.2", | ||
| "python-dateutil>=2.8.2", | ||
| "pytest>=7.1.2", | ||
| "python_dateutil>=2.8.2", | ||
| "requests>=2.28.1", | ||
| "voluptuous>=0.13.1", | ||
| "Jinja2>=3.1.2", | ||
| "pydantic>=2.0.0", | ||
| "sqlmodel>=0.0.24", | ||
| "aiosqlite>=0.20.0", | ||
| "sqlalchemy[asyncio]>=2.0.0", | ||
| "aiohttp>=3.8.0", | ||
| "diskcache>=5.4.0", | ||
| "diskcache>=5.6.3", | ||
| "aiohttp>=3.12.15" | ||
| ] | ||
| [project.optional-dependencies] | ||
| dev = [ | ||
| "pytest>=7.1.2", | ||
| "freezegun>=1.2.1", | ||
| "twine>=4.0.0", | ||
| "build>=0.10.0", | ||
| "black>=22.0.0", | ||
| "flake8>=5.0.0", | ||
| "mypy>=1.0.0", | ||
| ] | ||
| test = [ | ||
| "pytest>=7.1.2", | ||
| "freezegun>=1.2.1", | ||
| "pytest-cov>=4.0.0", | ||
| "pytest-asyncio>=0.20.0", | ||
| ] | ||
| [project.urls] | ||
| Homepage = "https://github.com/uvejota/python-edata" | ||
| Repository = "https://github.com/uvejota/python-edata" | ||
| Issues = "https://github.com/uvejota/python-edata/issues" | ||
| [tool.setuptools.packages.find] | ||
| where = ["edata"] | ||
| [tool.setuptools] | ||
| include-package-data = true | ||
| [tool.setuptools.packages.find] | ||
| exclude = ["tests*", "*.tests*", "*.tests", "edata.tests*"] | ||
| [tool.setuptools.package-data] | ||
| edata = ["py.typed"] | ||
# Development tool configuration
| [tool.black] | ||
| line-length = 88 | ||
| target-version = ['py38'] | ||
| include = '\.pyi?$' | ||
| extend-exclude = ''' | ||
| /( | ||
| # directories | ||
| \.eggs | ||
| | \.git | ||
| | \.hg | ||
| | \.mypy_cache | ||
| | \.tox | ||
| | \.venv | ||
| | build | ||
| | dist | ||
| )/ | ||
| ''' | ||
| [tool.pytest.ini_options] | ||
| testpaths = ["edata/tests"] | ||
| python_files = ["test_*.py", "*_test.py"] | ||
| python_classes = ["Test*"] | ||
| python_functions = ["test_*"] | ||
| addopts = "-v --tb=short" | ||
| [tool.mypy] | ||
| python_version = "3.8" | ||
| warn_return_any = true | ||
| warn_unused_configs = true | ||
| disallow_untyped_defs = true |
| .python-version | ||
| LICENSE | ||
| MANIFEST.in | ||
| Makefile | ||
| README.md | ||
| pyproject.toml | ||
| edata/__init__.py | ||
| edata/const.py | ||
| edata/helpers.py | ||
| edata/utils.py | ||
| edata/connectors/__init__.py | ||
| edata/connectors/datadis.py | ||
| edata/connectors/redata.py | ||
| edata/models/__init__.py | ||
| edata/models/base.py | ||
| edata/models/consumption.py | ||
| edata/models/contract.py | ||
| edata/models/database.py | ||
| edata/models/maximeter.py | ||
| edata/models/pricing.py | ||
| edata/models/supply.py | ||
| edata/scripts/__init__.py | ||
| edata/scripts/__main__.py | ||
| edata/scripts/dump.py | ||
| edata/services/__init__.py | ||
| edata/services/billing.py | ||
| edata/services/consumption.py | ||
| edata/services/contract.py | ||
| edata/services/database.py | ||
| edata/services/maximeter.py | ||
| edata/services/supply.py | ||
| edata/tests/__init__.py | ||
| edata/tests/test_helpers.py | ||
| edata/tests/connectors/__init__.py | ||
| edata/tests/connectors/test_datadis_connector.py | ||
| edata/tests/connectors/test_redata_connector.py | ||
| edata/tests/services/__init__.py | ||
| edata/tests/services/test_billing_service.py | ||
| edata/tests/services/test_consumption_service.py | ||
| edata/tests/services/test_contract_service.py | ||
| edata/tests/services/test_database_service.py | ||
| edata/tests/services/test_maximeter_service.py | ||
| edata/tests/services/test_supply_service.py |
| """Constants file.""" | ||
| PROG_NAME = "edata" | ||
| DEFAULT_STORAGE_DIR = "edata.storage" | ||
| # Attributes definition for backward compatibility | ||
| ATTRIBUTES = { | ||
| "cups": None, | ||
| "contract_p1_kW": "kW", | ||
| "contract_p2_kW": "kW", | ||
| "yesterday_kWh": "kWh", | ||
| "yesterday_hours": "h", | ||
| "yesterday_p1_kWh": "kWh", | ||
| "yesterday_p2_kWh": "kWh", | ||
| "yesterday_p3_kWh": "kWh", | ||
| "yesterday_surplus_kWh": "kWh", | ||
| "yesterday_surplus_p1_kWh": "kWh", | ||
| "yesterday_surplus_p2_kWh": "kWh", | ||
| "yesterday_surplus_p3_kWh": "kWh", | ||
| "last_registered_date": None, | ||
| "last_registered_day_kWh": "kWh", | ||
| "last_registered_day_hours": "h", | ||
| "last_registered_day_p1_kWh": "kWh", | ||
| "last_registered_day_p2_kWh": "kWh", | ||
| "last_registered_day_p3_kWh": "kWh", | ||
| "last_registered_day_surplus_kWh": "kWh", | ||
| "last_registered_day_surplus_p1_kWh": "kWh", | ||
| "last_registered_day_surplus_p2_kWh": "kWh", | ||
| "last_registered_day_surplus_p3_kWh": "kWh", | ||
| "month_kWh": "kWh", | ||
| "month_daily_kWh": "kWh", | ||
| "month_days": "d", | ||
| "month_p1_kWh": "kWh", | ||
| "month_p2_kWh": "kWh", | ||
| "month_p3_kWh": "kWh", | ||
| "month_surplus_kWh": "kWh", | ||
| "month_surplus_p1_kWh": "kWh", | ||
| "month_surplus_p2_kWh": "kWh", | ||
| "month_surplus_p3_kWh": "kWh", | ||
| "month_€": "€", | ||
| "last_month_kWh": "kWh", | ||
| "last_month_daily_kWh": "kWh", | ||
| "last_month_days": "d", | ||
| "last_month_p1_kWh": "kWh", | ||
| "last_month_p2_kWh": "kWh", | ||
| "last_month_p3_kWh": "kWh", | ||
| "last_month_surplus_kWh": "kWh", | ||
| "last_month_surplus_p1_kWh": "kWh", | ||
| "last_month_surplus_p2_kWh": "kWh", | ||
| "last_month_surplus_p3_kWh": "kWh", | ||
| "last_month_€": "€", | ||
| "max_power_kW": "kW", | ||
| "max_power_date": None, | ||
| "max_power_mean_kW": "kW", | ||
| "max_power_90perc_kW": "kW", | ||
| } |
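Since `ATTRIBUTES` maps each attribute name to its unit (or `None` for unitless values), it can drive a human-readable summary. A small sketch, using a copied subset of the mapping and made-up values:

```python
# A copy of a few entries from the ATTRIBUTES mapping above.
ATTRIBUTES = {
    "cups": None,
    "yesterday_kWh": "kWh",
    "max_power_kW": "kW",
}

def format_summary(values: dict) -> list[str]:
    """Render 'name: value unit' lines, skipping missing values."""
    lines = []
    for name, unit in ATTRIBUTES.items():
        value = values.get(name)
        if value is None:
            continue
        suffix = f" {unit}" if unit else ""
        lines.append(f"{name}: {value}{suffix}")
    return lines

summary = format_summary({"yesterday_kWh": 7.9, "max_power_kW": 3.45})
```

Because `ATTRIBUTES` fixes the iteration order, the rendered summary is stable across runs.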
| """A module for edata helpers.""" | ||
| import logging | ||
| from datetime import datetime | ||
| from typing import Any, Dict | ||
| from edata.connectors.datadis import DatadisConnector | ||
| from edata.const import ATTRIBUTES | ||
| from edata.models.pricing import PricingRules | ||
| from edata.services.billing import BillingService | ||
| from edata.services.consumption import ConsumptionService | ||
| from edata.services.contract import ContractService | ||
| from edata.services.maximeter import MaximeterService | ||
| from edata.services.supply import SupplyService | ||
| _LOGGER = logging.getLogger(__name__) | ||
| def acups(cups): | ||
| """Print an abbreviated and anonymized CUPS.""" | ||
| return cups[-5:] | ||
| class EdataHelper: | ||
| """Main EdataHelper class using service-based architecture.""" | ||
| def __init__( | ||
| self, | ||
| datadis_username: str, | ||
| datadis_password: str, | ||
| cups: str, | ||
| datadis_authorized_nif: str | None = None, | ||
| pricing_rules: PricingRules | None = None, | ||
| storage_dir_path: str | None = None, | ||
| enable_smart_fetch: bool = True, | ||
| ) -> None: | ||
| """Initialize EdataHelper with service-based architecture. | ||
| Args: | ||
| datadis_username: Datadis username | ||
| datadis_password: Datadis password | ||
| cups: CUPS identifier | ||
| datadis_authorized_nif: Optional authorized NIF | ||
| pricing_rules: Pricing configuration | ||
| storage_dir_path: Directory for database and cache storage | ||
| enable_smart_fetch: Enable smart fetching in datadis connector | ||
| """ | ||
| self._cups = cups | ||
| self._scups = acups(cups) | ||
| self._authorized_nif = datadis_authorized_nif | ||
| self._storage_dir = storage_dir_path | ||
| self.pricing_rules = pricing_rules | ||
| # Initialize summary attributes | ||
| self.summary: Dict[str, Any] = {} | ||
| for attr in ATTRIBUTES: | ||
| self.summary[attr] = None | ||
| # For backward compatibility, alias 'attributes' to 'summary' | ||
| self.attributes = self.summary | ||
| # Determine if using PVPC pricing | ||
| self.enable_billing = pricing_rules is not None | ||
| if self.enable_billing: | ||
| self.is_pvpc = not all( | ||
| getattr(pricing_rules, x, None) is not None | ||
| for x in ("p1_kwh_eur", "p2_kwh_eur", "p3_kwh_eur") | ||
| ) | ||
| else: | ||
| self.is_pvpc = False | ||
| # Create shared Datadis connector | ||
| self._datadis_connector = DatadisConnector( | ||
| username=datadis_username, | ||
| password=datadis_password, | ||
| enable_smart_fetch=enable_smart_fetch, | ||
| storage_path=storage_dir_path, | ||
| ) | ||
| # Initialize services with dependency injection | ||
| self._supply_service = SupplyService( | ||
| datadis_connector=self._datadis_connector, | ||
| storage_dir=storage_dir_path, | ||
| ) | ||
| self._contract_service = ContractService( | ||
| datadis_connector=self._datadis_connector, | ||
| storage_dir=storage_dir_path, | ||
| ) | ||
| self._consumption_service = ConsumptionService( | ||
| datadis_connector=self._datadis_connector, | ||
| storage_dir=storage_dir_path, | ||
| ) | ||
| self._maximeter_service = MaximeterService( | ||
| datadis_connector=self._datadis_connector, | ||
| storage_dir=storage_dir_path, | ||
| ) | ||
| if self.enable_billing: | ||
| self._billing_service = BillingService(storage_dir=storage_dir_path) | ||
| _LOGGER.info(f"EdataHelper initialized for CUPS {self._scups}") | ||
| @property | ||
| def datadis_connector(self) -> DatadisConnector: | ||
| """Get the shared Datadis connector instance.""" | ||
| return self._datadis_connector | ||
| async def update( | ||
| self, | ||
| date_from: datetime = datetime(1970, 1, 1), | ||
| date_to: datetime = datetime.today(), | ||
| ): | ||
| """Update all data and calculate summary attributes. | ||
| Args: | ||
| date_from: Start date for data updates | ||
| date_to: End date for data updates | ||
| """ | ||
| _LOGGER.info( | ||
| f"{self._scups}: Starting update from {date_from.date()} to {date_to.date()}" | ||
| ) | ||
| try: | ||
| # Step 1: Update supplies | ||
| _LOGGER.info(f"{self._scups}: Updating supplies") | ||
| supply_result = await self._supply_service.update_supplies( | ||
| authorized_nif=self._authorized_nif | ||
| ) | ||
| if not supply_result["success"]: | ||
| _LOGGER.error( | ||
| f"{self._scups}: Failed to update supplies: {supply_result.get('error', 'Unknown error')}" | ||
| ) | ||
| return False | ||
| # Validate that our CUPS exists | ||
| if not await self._supply_service.validate_cups(self._cups): | ||
| _LOGGER.error(f"{self._scups}: CUPS not found in account") | ||
| return False | ||
| _LOGGER.info(f"{self._scups}: CUPS validated successfully") | ||
| # Get supply information | ||
| supply = await self._supply_service.get_supply_by_cups(self._cups) | ||
| if not supply: | ||
| _LOGGER.error(f"{self._scups}: Could not retrieve supply details") | ||
| return False | ||
| distributor_code = supply.distributor_code | ||
| point_type = supply.point_type | ||
| _LOGGER.info( | ||
| f"{self._scups}: Supply dates from {supply.date_start.date()} to {supply.date_end.date()}" | ||
| ) | ||
| # Adjust date range to supply validity period | ||
| effective_start = max(date_from, supply.date_start) | ||
| effective_end = min(date_to, supply.date_end) | ||
| # Step 2: Update contracts | ||
| _LOGGER.info(f"{self._scups}: Updating contracts") | ||
| contract_result = await self._contract_service.update_contracts( | ||
| cups=self._cups, | ||
| distributor_code=distributor_code, | ||
| authorized_nif=self._authorized_nif, | ||
| ) | ||
| if not contract_result["success"]: | ||
| _LOGGER.warning( | ||
| f"{self._scups}: Contract update failed: {contract_result.get('error', 'Unknown error')}" | ||
| ) | ||
| # Step 3: Update consumptions in monthly chunks | ||
| _LOGGER.info(f"{self._scups}: Updating consumptions") | ||
| consumption_result = ( | ||
| await self._consumption_service.update_consumption_range_by_months( | ||
| cups=self._cups, | ||
| distributor_code=distributor_code, | ||
| start_date=effective_start, | ||
| end_date=effective_end, | ||
| measurement_type="0", | ||
| point_type=point_type, | ||
| authorized_nif=self._authorized_nif, | ||
| ) | ||
| ) | ||
| if not consumption_result["success"]: | ||
| _LOGGER.warning(f"{self._scups}: Consumption update failed") | ||
| # Step 4: Update maximeter data | ||
| _LOGGER.info(f"{self._scups}: Updating maximeter") | ||
| maximeter_result = ( | ||
| await self._maximeter_service.update_maxpower_range_by_months( | ||
| cups=self._cups, | ||
| distributor_code=distributor_code, | ||
| start_date=effective_start, | ||
| end_date=effective_end, | ||
| authorized_nif=self._authorized_nif, | ||
| ) | ||
| ) | ||
| if not maximeter_result["success"]: | ||
| _LOGGER.warning(f"{self._scups}: Maximeter update failed") | ||
| # Step 5: Update PVPC prices if needed | ||
| if self.enable_billing and self.is_pvpc: | ||
| _LOGGER.info(f"{self._scups}: Updating PVPC prices") | ||
| try: | ||
| pvpc_result = await self._billing_service.update_pvpc_prices( | ||
| start_date=effective_start, | ||
| end_date=effective_end, | ||
| is_ceuta_melilla=False, # Default to Peninsula | ||
| ) | ||
| if not pvpc_result["success"]: | ||
| _LOGGER.warning( | ||
| f"{self._scups}: PVPC price update failed: {pvpc_result.get('error', 'Unknown error')}" | ||
| ) | ||
| except Exception as e: | ||
| _LOGGER.warning( | ||
| f"{self._scups}: PVPC price update failed with exception: {str(e)}" | ||
| ) | ||
| # Step 6: Update billing costs if pricing rules are defined | ||
| if self.enable_billing and self.pricing_rules: | ||
| _LOGGER.info(f"{self._scups}: Updating billing costs") | ||
| try: | ||
| billing_result = await self._billing_service.update_missing_costs( | ||
| cups=self._cups, | ||
| pricing_rules=self.pricing_rules, | ||
| start_date=effective_start, | ||
| end_date=effective_end, | ||
| is_ceuta_melilla=False, | ||
| force_recalculate=False, | ||
| ) | ||
| if not billing_result["success"]: | ||
| _LOGGER.warning( | ||
| f"{self._scups}: Billing cost update failed: {billing_result.get('error', 'Unknown error')}" | ||
| ) | ||
| except Exception as e: | ||
| _LOGGER.warning( | ||
| f"{self._scups}: Billing cost update failed with exception: {str(e)}" | ||
| ) | ||
| # Step 7: Calculate summary attributes | ||
| _LOGGER.info(f"{self._scups}: Calculating summary attributes") | ||
| await self._calculate_summary_attributes() | ||
| _LOGGER.info(f"{self._scups}: Update completed successfully") | ||
| return True | ||
| except Exception as e: | ||
| _LOGGER.error(f"{self._scups}: Update failed with exception: {str(e)}") | ||
| return False | ||
| async def _calculate_summary_attributes(self): | ||
| """Calculate summary attributes from all services.""" | ||
| # Reset all attributes | ||
| for attr in ATTRIBUTES: | ||
| self.summary[attr] = None | ||
| try: | ||
| # Get supply summary | ||
| supply_summary = await self._supply_service.get_supply_summary(self._cups) | ||
| self.summary.update(supply_summary) | ||
| # Get contract summary | ||
| contract_summary = await self._contract_service.get_contract_summary( | ||
| self._cups | ||
| ) | ||
| self.summary.update(contract_summary) | ||
| # Get consumption summary | ||
| consumption_summary = ( | ||
| await self._consumption_service.get_consumption_summary(self._cups) | ||
| ) | ||
| self.summary.update(consumption_summary) | ||
| # Get maximeter summary | ||
| maximeter_summary = await self._maximeter_service.get_maximeter_summary( | ||
| self._cups | ||
| ) | ||
| self.summary.update(maximeter_summary) | ||
| # Get billing summary if enabled | ||
| if self.enable_billing and self.pricing_rules and self._billing_service: | ||
| billing_summary = await self._billing_service.get_billing_summary( | ||
| cups=self._cups, | ||
| pricing_rules=self.pricing_rules, | ||
| is_ceuta_melilla=False, | ||
| ) | ||
| self.summary.update(billing_summary) | ||
| # Round numeric values to 2 decimal places for consistency | ||
| for key, value in self.summary.items(): | ||
| if isinstance(value, float): | ||
| self.summary[key] = round(value, 2) | ||
| _LOGGER.debug(f"{self._scups}: Summary attributes calculated successfully") | ||
| except Exception as e: | ||
| _LOGGER.error( | ||
| f"{self._scups}: Error calculating summary attributes: {str(e)}" | ||
| ) |
| """Pydantic models for edata. | ||
| This module contains all data models using Pydantic for robust validation, | ||
| serialization and better developer experience. | ||
| """ | ||
| from edata.models.consumption import Consumption, ConsumptionAggregated | ||
| from edata.models.contract import Contract | ||
| from edata.models.maximeter import MaxPower | ||
| from edata.models.pricing import PricingAggregated, PricingData, PricingRules | ||
| from edata.models.supply import Supply | ||
| __all__ = [ | ||
| "Supply", | ||
| "Contract", | ||
| "Consumption", | ||
| "ConsumptionAggregated", | ||
| "PricingData", | ||
| "PricingRules", | ||
| "PricingAggregated", | ||
| "MaxPower", | ||
| ] |
| """Base models and common functionality for edata Pydantic models.""" | ||
| from datetime import datetime | ||
| from typing import Any, Dict | ||
| from pydantic import BaseModel, ConfigDict, field_validator | ||
| class EdataBaseModel(BaseModel): | ||
| """Base model for all edata entities with common configuration.""" | ||
| model_config = ConfigDict( | ||
| # Validate assignments to ensure data integrity | ||
| validate_assignment=True, | ||
| # Use enum values instead of enum objects for serialization | ||
| use_enum_values=True, | ||
| # Extra fields are forbidden to catch typos and ensure schema compliance | ||
| extra="forbid", | ||
| # Validate default values | ||
| validate_default=True, | ||
| # Disallow arbitrary (non-validatable) field types | ||
| arbitrary_types_allowed=False, | ||
| # Strip leading/trailing whitespace from string inputs | ||
| str_strip_whitespace=True, | ||
| ) | ||
| def model_dump_for_storage(self) -> Dict[str, Any]: | ||
| """Serialize model for storage, handling special types like datetime.""" | ||
| return self.model_dump(mode="json") | ||
| @classmethod | ||
| def from_storage(cls, data: Dict[str, Any]): | ||
| """Create model instance from storage data.""" | ||
| return cls.model_validate(data) | ||
| class TimestampMixin(BaseModel): | ||
| """Mixin for models that have datetime fields.""" | ||
| @field_validator("*", mode="before") | ||
| @classmethod | ||
| def validate_datetime_fields(cls, v, info): | ||
| """Convert datetime strings to datetime objects if needed.""" | ||
| field_name = info.field_name | ||
| if field_name and ("datetime" in field_name or "date" in field_name): | ||
| if isinstance(v, str): | ||
| try: | ||
| from dateutil import parser | ||
| return parser.parse(v) | ||
| except (ValueError, TypeError): | ||
| pass | ||
| return v | ||
| class EnergyMixin(BaseModel): | ||
| """Mixin for models dealing with energy values.""" | ||
| @field_validator("*", mode="before") | ||
| @classmethod | ||
| def validate_energy_fields(cls, v, info): | ||
| """Validate energy-related fields.""" | ||
| field_name = info.field_name | ||
| if field_name and ("kwh" in field_name.lower() or "kw" in field_name.lower()): | ||
| if v is not None and v < 0: | ||
| raise ValueError(f"{field_name} cannot be negative") | ||
| return v | ||
| def validate_cups(v: str) -> str: | ||
| """Validate CUPS (Spanish electricity supply point code) format.""" | ||
| if not v: | ||
| raise ValueError("CUPS cannot be empty") | ||
| # Remove spaces and convert to uppercase | ||
| cups = v.replace(" ", "").upper() | ||
| # Basic CUPS format validation (ES + 18-20 alphanumeric characters) | ||
| if not cups.startswith("ES"): | ||
| raise ValueError("CUPS must start with 'ES'") | ||
| if len(cups) < 20 or len(cups) > 22: | ||
| raise ValueError("CUPS must be 20-22 characters long") | ||
| return cups | ||
| def validate_positive_number(v: float) -> float: | ||
| """Validate that a number is positive.""" | ||
| if v is not None and v < 0: | ||
| raise ValueError("Value must be positive") | ||
| return v | ||
| def validate_reasonable_datetime(v: datetime) -> datetime: | ||
| """Validate that datetime is within reasonable bounds.""" | ||
| if v.year < 2000: | ||
| raise ValueError("Date cannot be before year 2000") | ||
| # Allow future dates for contracts and supplies (they can be valid until future dates) | ||
| # Only restrict to really unreasonable future dates | ||
| if v.year > datetime.now().year + 50: | ||
| raise ValueError("Date cannot be more than 50 years in the future") | ||
| return v |
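The `validate_cups` helper normalizes its input before checking the format. A minimal standalone sketch of the same rules (a re-statement for illustration, not an import of the library function; the sample CUPS value is made-up):

```python
def validate_cups(v: str) -> str:
    """Normalize and validate a Spanish CUPS code (sketch of the rules above)."""
    if not v:
        raise ValueError("CUPS cannot be empty")
    # Normalize: strip spaces, uppercase
    cups = v.replace(" ", "").upper()
    # Basic format: "ES" prefix, 20-22 characters total
    if not cups.startswith("ES"):
        raise ValueError("CUPS must start with 'ES'")
    if len(cups) < 20 or len(cups) > 22:
        raise ValueError("CUPS must be 20-22 characters long")
    return cups


# Normalization means lowercase, space-separated input is accepted:
normalized = validate_cups("es 0021 0000 1234 5678 ab")
```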
| """Consumption (consumo) related Pydantic models.""" | ||
| from datetime import datetime as dt | ||
| from pydantic import Field, field_validator | ||
| from edata.models.base import ( | ||
| EdataBaseModel, | ||
| EnergyMixin, | ||
| TimestampMixin, | ||
| validate_positive_number, | ||
| validate_reasonable_datetime, | ||
| ) | ||
| class Consumption(EdataBaseModel, TimestampMixin, EnergyMixin): | ||
| """Pydantic model for electricity consumption data.""" | ||
| datetime: dt = Field(..., description="Timestamp of the consumption measurement") | ||
| delta_h: float = Field( | ||
| ..., description="Time interval in hours for this measurement", gt=0, le=24 | ||
| ) | ||
| value_kwh: float = Field(..., description="Energy consumption in kWh", ge=0) | ||
| surplus_kwh: float = Field( | ||
| default=0.0, description="Energy surplus/generation in kWh", ge=0 | ||
| ) | ||
| real: bool = Field( | ||
| default=True, description="Whether this is a real measurement or estimated" | ||
| ) | ||
| @field_validator("datetime") | ||
| @classmethod | ||
| def validate_datetime_range(cls, v: dt) -> dt: | ||
| """Validate datetime is reasonable.""" | ||
| return validate_reasonable_datetime(v) | ||
| @field_validator("value_kwh", "surplus_kwh") | ||
| @classmethod | ||
| def validate_energy_values(cls, v: float) -> float: | ||
| """Validate energy values are positive.""" | ||
| return validate_positive_number(v) | ||
| def __str__(self) -> str: | ||
| """String representation.""" | ||
| return f"Consumption({self.datetime}, {self.value_kwh}kWh)" | ||
| def __repr__(self) -> str: | ||
| """Developer representation.""" | ||
| return f"Consumption(datetime={self.datetime}, value_kwh={self.value_kwh}, real={self.real})" | ||
| class ConsumptionAggregated(EdataBaseModel, TimestampMixin, EnergyMixin): | ||
| """Pydantic model for aggregated consumption data (daily/monthly summaries).""" | ||
| datetime: dt = Field( | ||
| ..., description="Timestamp representing the start of the aggregation period" | ||
| ) | ||
| value_kwh: float = Field( | ||
| ..., description="Total energy consumption in kWh for the period", ge=0 | ||
| ) | ||
| value_p1_kwh: float = Field( | ||
| default=0.0, description="Energy consumption in period P1 (kWh)", ge=0 | ||
| ) | ||
| value_p2_kwh: float = Field( | ||
| default=0.0, description="Energy consumption in period P2 (kWh)", ge=0 | ||
| ) | ||
| value_p3_kwh: float = Field( | ||
| default=0.0, description="Energy consumption in period P3 (kWh)", ge=0 | ||
| ) | ||
| surplus_kwh: float = Field( | ||
| default=0.0, | ||
| description="Total energy surplus/generation in kWh for the period", | ||
| ge=0, | ||
| ) | ||
| surplus_p1_kwh: float = Field( | ||
| default=0.0, description="Energy surplus in period P1 (kWh)", ge=0 | ||
| ) | ||
| surplus_p2_kwh: float = Field( | ||
| default=0.0, description="Energy surplus in period P2 (kWh)", ge=0 | ||
| ) | ||
| surplus_p3_kwh: float = Field( | ||
| default=0.0, description="Energy surplus in period P3 (kWh)", ge=0 | ||
| ) | ||
| delta_h: float = Field( | ||
| ..., description="Duration of the aggregation period in hours", gt=0 | ||
| ) | ||
| @field_validator("datetime") | ||
| @classmethod | ||
| def validate_datetime_range(cls, v: dt) -> dt: | ||
| """Validate datetime is reasonable.""" | ||
| return validate_reasonable_datetime(v) | ||
| def __str__(self) -> str: | ||
| """String representation.""" | ||
| period = "day" if self.delta_h <= 24 else "month" | ||
| return f"ConsumptionAgg({self.datetime.date()}, {self.value_kwh}kWh/{period})" | ||
| def __repr__(self) -> str: | ||
| """Developer representation.""" | ||
| return f"ConsumptionAggregated(datetime={self.datetime}, value_kwh={self.value_kwh}, delta_h={self.delta_h})" |
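Hourly `Consumption` records are typically rolled up into the aggregated form above by summing `value_kwh` and `delta_h` over the period. A stdlib-only sketch of that roll-up (field names follow the models; the aggregation function itself is an assumption for illustration, not the library's implementation):

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Hourly:
    """Minimal stand-in for an hourly Consumption record."""
    datetime: datetime
    value_kwh: float
    delta_h: float = 1.0


def aggregate_day(rows):
    """Sum hourly rows into one daily total (value_kwh and delta_h add up)."""
    return {
        "datetime": rows[0].datetime.replace(hour=0),
        "value_kwh": round(sum(r.value_kwh for r in rows), 2),
        "delta_h": sum(r.delta_h for r in rows),
    }


# 24 hourly readings collapse into one daily row with delta_h == 24
day = aggregate_day(
    [Hourly(datetime(2024, 1, 1, h), value_kwh=0.25 * (h + 1)) for h in range(24)]
)
```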
| """Contract (contrato) related Pydantic models.""" | ||
| from datetime import datetime | ||
| from typing import Optional | ||
| from pydantic import Field, field_validator | ||
| from edata.models.base import ( | ||
| EdataBaseModel, | ||
| TimestampMixin, | ||
| validate_positive_number, | ||
| validate_reasonable_datetime, | ||
| ) | ||
| class Contract(EdataBaseModel, TimestampMixin): | ||
| """Pydantic model for electricity contract data.""" | ||
| date_start: datetime = Field(..., description="Contract start date") | ||
| date_end: datetime = Field(..., description="Contract end date") | ||
| marketer: str = Field(..., description="Energy marketer company name", min_length=1) | ||
| distributor_code: str = Field( | ||
| ..., description="Distributor company code", min_length=1 | ||
| ) | ||
| power_p1: Optional[float] = Field( | ||
| None, description="Contracted power for period P1 (kW)", ge=0 | ||
| ) | ||
| power_p2: Optional[float] = Field( | ||
| None, description="Contracted power for period P2 (kW)", ge=0 | ||
| ) | ||
| @field_validator("date_start", "date_end") | ||
| @classmethod | ||
| def validate_date_range(cls, v: datetime) -> datetime: | ||
| """Validate date is reasonable.""" | ||
| return validate_reasonable_datetime(v) | ||
| @field_validator("power_p1", "power_p2") | ||
| @classmethod | ||
| def validate_power_values(cls, v: Optional[float]) -> Optional[float]: | ||
| """Validate power values are positive.""" | ||
| if v is not None: | ||
| return validate_positive_number(v) | ||
| return v | ||
| def __str__(self) -> str: | ||
| """String representation.""" | ||
| return f"Contract(marketer={self.marketer}, power_p1={self.power_p1}kW)" | ||
| def __repr__(self) -> str: | ||
| """Developer representation.""" | ||
| return f"Contract(marketer={self.marketer}, date_start={self.date_start}, date_end={self.date_end})" |
| from datetime import datetime as DateTime | ||
| from typing import List, Optional | ||
| from sqlalchemy import UniqueConstraint | ||
| # sqlmodel's Field is the one used throughout this module | ||
| from sqlmodel import Field, Relationship, SQLModel | ||
| from edata.models import Consumption, Contract, MaxPower, PricingData, Supply | ||
| class SupplyModel(Supply, SQLModel, table=True): | ||
| """SQLModel for electricity supply data inheriting from Pydantic model.""" | ||
| __tablename__: str = "supplies" | ||
| # Override cups field to add primary key | ||
| cups: str = Field(primary_key=True, min_length=20, max_length=22) | ||
| # Add database-specific fields | ||
| created_at: DateTime = Field(default_factory=DateTime.now) | ||
| updated_at: DateTime = Field(default_factory=DateTime.now) | ||
| # Relationships | ||
| contracts: List["ContractModel"] = Relationship(back_populates="supply") | ||
| consumptions: List["ConsumptionModel"] = Relationship(back_populates="supply") | ||
| maximeter: List["MaxPowerModel"] = Relationship(back_populates="supply") | ||
| class ContractModel(Contract, SQLModel, table=True): | ||
| """SQLModel for electricity contract data inheriting from Pydantic model.""" | ||
| __tablename__: str = "contracts" | ||
| __table_args__ = (UniqueConstraint("cups", "date_start"),) | ||
| # Add ID field for database | ||
| id: Optional[int] = Field(default=None, primary_key=True) | ||
| # Add CUPS field for foreign key | ||
| cups: str = Field(foreign_key="supplies.cups") | ||
| # Add database-specific fields | ||
| created_at: DateTime = Field(default_factory=DateTime.now) | ||
| updated_at: DateTime = Field(default_factory=DateTime.now) | ||
| # Relationships | ||
| supply: Optional["SupplyModel"] = Relationship(back_populates="contracts") | ||
| class ConsumptionModel(Consumption, SQLModel, table=True): | ||
| """SQLModel for electricity consumption data inheriting from Pydantic model.""" | ||
| __tablename__: str = "consumptions" | ||
| __table_args__ = (UniqueConstraint("cups", "datetime"),) | ||
| # Add ID field for database | ||
| id: Optional[int] = Field(default=None, primary_key=True) | ||
| # Add CUPS field for foreign key | ||
| cups: str = Field(foreign_key="supplies.cups") | ||
| # Add database-specific fields | ||
| created_at: DateTime = Field(default_factory=DateTime.now) | ||
| updated_at: DateTime = Field(default_factory=DateTime.now) | ||
| # Relationships | ||
| supply: Optional["SupplyModel"] = Relationship(back_populates="consumptions") | ||
| class MaxPowerModel(MaxPower, SQLModel, table=True): | ||
| """SQLModel for maximum power demand data inheriting from Pydantic model.""" | ||
| __tablename__: str = "maximeter" | ||
| __table_args__ = (UniqueConstraint("cups", "datetime"),) | ||
| # Add ID field for database | ||
| id: Optional[int] = Field(default=None, primary_key=True) | ||
| # Add CUPS field for foreign key | ||
| cups: str = Field(foreign_key="supplies.cups") | ||
| # Add database-specific fields | ||
| created_at: DateTime = Field(default_factory=DateTime.now) | ||
| updated_at: DateTime = Field(default_factory=DateTime.now) | ||
| # Relationships | ||
| supply: Optional["SupplyModel"] = Relationship(back_populates="maximeter") | ||
| class PVPCPricesModel(PricingData, SQLModel, table=True): | ||
| """SQLModel for PVPC pricing data inheriting from Pydantic model.""" | ||
| __tablename__: str = "pvpc_prices" | ||
| __table_args__ = (UniqueConstraint("datetime", "geo_id"),) | ||
| # Add ID field for database | ||
| id: Optional[int] = Field(default=None, primary_key=True) | ||
| # Add database-specific fields | ||
| created_at: DateTime = Field(default_factory=DateTime.now) | ||
| updated_at: DateTime = Field(default_factory=DateTime.now) | ||
| # Add required fields for geographic specificity | ||
| geo_id: int = Field( | ||
| description="Geographic identifier (8741=Peninsula, 8744=Ceuta/Melilla)" | ||
| ) | ||
| class BillingModel(SQLModel, table=True): | ||
| """SQLModel for billing calculations per hour.""" | ||
| __tablename__: str = "billing" | ||
| __table_args__ = (UniqueConstraint("cups", "datetime", "pricing_config_hash"),) | ||
| # Primary key | ||
| id: Optional[int] = Field(default=None, primary_key=True) | ||
| # Foreign key to supply | ||
| cups: str = Field(foreign_key="supplies.cups") | ||
| datetime: DateTime = Field(description="Hour of the billing calculation") | ||
| # Calculated cost terms (the essential billing data) | ||
| energy_term: float = Field(default=0.0, description="Energy cost term in €") | ||
| power_term: float = Field(default=0.0, description="Power cost term in €") | ||
| others_term: float = Field(default=0.0, description="Other costs term in €") | ||
| surplus_term: float = Field(default=0.0, description="Surplus income term in €") | ||
| total_eur: float = Field(default=0.0, description="Total cost in €") | ||
| # Metadata | ||
| tariff: Optional[str] = Field( | ||
| default=None, description="Tariff period (p1, p2, p3)" | ||
| ) | ||
| pricing_config_hash: str = Field(description="Hash of pricing rules configuration") | ||
| # Audit fields | ||
| created_at: DateTime = Field(default_factory=DateTime.now) | ||
| updated_at: DateTime = Field(default_factory=DateTime.now) |
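`BillingModel` keys cached cost rows by a `pricing_config_hash`, so a change in tariff configuration invalidates previously calculated costs. The library's actual hashing scheme is not shown in this listing; one plausible, deterministic way to derive such a hash from a rules dict (an assumption for illustration only):

```python
import hashlib
import json


def pricing_config_hash(rules: dict) -> str:
    """Derive a stable hash from pricing rules (illustrative, not the library's scheme)."""
    # Canonical JSON: sorted keys and fixed separators make the hash order-independent
    canonical = json.dumps(rules, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]


a = pricing_config_hash({"p1_kw_year_eur": 30.0, "iva_tax": 1.21})
b = pricing_config_hash({"iva_tax": 1.21, "p1_kw_year_eur": 30.0})
# Key order does not matter; changing any value changes the hash.
```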
| """Maximeter (maxímetro) related Pydantic models.""" | ||
| from datetime import datetime as dt | ||
| from pydantic import Field, field_validator | ||
| from edata.models.base import ( | ||
| EdataBaseModel, | ||
| TimestampMixin, | ||
| validate_positive_number, | ||
| validate_reasonable_datetime, | ||
| ) | ||
| class MaxPower(EdataBaseModel, TimestampMixin): | ||
| """Pydantic model for maximum power demand data.""" | ||
| datetime: dt = Field(..., description="Timestamp when maximum power was recorded") | ||
| value_kw: float = Field(..., description="Maximum power demand in kW", ge=0) | ||
| @field_validator("datetime") | ||
| @classmethod | ||
| def validate_datetime_range(cls, v: dt) -> dt: | ||
| """Validate datetime is reasonable.""" | ||
| return validate_reasonable_datetime(v) | ||
| @field_validator("value_kw") | ||
| @classmethod | ||
| def validate_power_value(cls, v: float) -> float: | ||
| """Validate power value is positive.""" | ||
| return validate_positive_number(v) | ||
| def __str__(self) -> str: | ||
| """String representation.""" | ||
| return f"MaxPower({self.datetime}, {self.value_kw}kW)" | ||
| def __repr__(self) -> str: | ||
| """Developer representation.""" | ||
| return f"MaxPower(datetime={self.datetime}, value_kw={self.value_kw})" |
| """Pricing related Pydantic models.""" | ||
| from datetime import datetime as dt | ||
| from typing import Optional | ||
| from pydantic import Field, field_validator | ||
| from edata.models.base import ( | ||
| EdataBaseModel, | ||
| TimestampMixin, | ||
| validate_positive_number, | ||
| validate_reasonable_datetime, | ||
| ) | ||
| class PricingData(EdataBaseModel, TimestampMixin): | ||
| """Pydantic model for electricity pricing data (PVPC prices).""" | ||
| datetime: dt = Field(..., description="Timestamp of the price data") | ||
| value_eur_kwh: float = Field(..., description="Price in EUR per kWh", ge=0) | ||
| delta_h: float = Field( | ||
| default=1.0, description="Duration this price applies (hours)", gt=0, le=24 | ||
| ) | ||
| @field_validator("datetime") | ||
| @classmethod | ||
| def validate_datetime_range(cls, v: dt) -> dt: | ||
| """Validate datetime is reasonable.""" | ||
| return validate_reasonable_datetime(v) | ||
| @field_validator("value_eur_kwh") | ||
| @classmethod | ||
| def validate_price_value(cls, v: float) -> float: | ||
| """Validate price value is positive.""" | ||
| return validate_positive_number(v) | ||
| def __str__(self) -> str: | ||
| """String representation.""" | ||
| return f"Price({self.datetime}, {self.value_eur_kwh:.4f}€/kWh)" | ||
| def __repr__(self) -> str: | ||
| """Developer representation.""" | ||
| return ( | ||
| f"PricingData(datetime={self.datetime}, value_eur_kwh={self.value_eur_kwh})" | ||
| ) | ||
| class PricingRules(EdataBaseModel): | ||
| """Pydantic model for custom pricing rules configuration.""" | ||
| # Power term costs (yearly costs in EUR per kW) | ||
| p1_kw_year_eur: float = Field( | ||
| ..., description="P1 power term cost (EUR/kW/year)", ge=0 | ||
| ) | ||
| p2_kw_year_eur: float = Field( | ||
| ..., description="P2 power term cost (EUR/kW/year)", ge=0 | ||
| ) | ||
| # Energy term costs (optional for fixed pricing) | ||
| p1_kwh_eur: Optional[float] = Field( | ||
| None, description="P1 energy term cost (EUR/kWh) - None for PVPC", ge=0 | ||
| ) | ||
| p2_kwh_eur: Optional[float] = Field( | ||
| None, description="P2 energy term cost (EUR/kWh) - None for PVPC", ge=0 | ||
| ) | ||
| p3_kwh_eur: Optional[float] = Field( | ||
| None, description="P3 energy term cost (EUR/kWh) - None for PVPC", ge=0 | ||
| ) | ||
| # Surplus compensation (optional) | ||
| surplus_p1_kwh_eur: Optional[float] = Field( | ||
| None, description="P1 surplus compensation (EUR/kWh)", ge=0 | ||
| ) | ||
| surplus_p2_kwh_eur: Optional[float] = Field( | ||
| None, description="P2 surplus compensation (EUR/kWh)", ge=0 | ||
| ) | ||
| surplus_p3_kwh_eur: Optional[float] = Field( | ||
| None, description="P3 surplus compensation (EUR/kWh)", ge=0 | ||
| ) | ||
| # Fixed costs | ||
| meter_month_eur: float = Field( | ||
| ..., description="Monthly meter rental cost (EUR/month)", ge=0 | ||
| ) | ||
| market_kw_year_eur: float = Field( | ||
| ..., description="Market operator cost (EUR/kW/year)", ge=0 | ||
| ) | ||
| # Tax multipliers | ||
| electricity_tax: float = Field( | ||
| ..., description="Electricity tax multiplier (e.g., 1.05113 for 5.113%)", ge=1.0 | ||
| ) | ||
| iva_tax: float = Field( | ||
| ..., description="VAT tax multiplier (e.g., 1.21 for 21%)", ge=1.0 | ||
| ) | ||
| # Custom formulas (optional) | ||
| energy_formula: Optional[str] = Field( | ||
| "electricity_tax * iva_tax * kwh_eur * kwh", | ||
| description="Custom energy cost formula (Jinja2 template)", | ||
| ) | ||
| power_formula: Optional[str] = Field( | ||
| "electricity_tax * iva_tax * (p1_kw * (p1_kw_year_eur + market_kw_year_eur) + p2_kw * p2_kw_year_eur) / 365 / 24", | ||
| description="Custom power cost formula (Jinja2 template)", | ||
| ) | ||
| others_formula: Optional[str] = Field( | ||
| "iva_tax * meter_month_eur / 30 / 24", | ||
| description="Custom other costs formula (Jinja2 template)", | ||
| ) | ||
| surplus_formula: Optional[str] = Field( | ||
| "electricity_tax * iva_tax * surplus_kwh * surplus_kwh_eur", | ||
| description="Custom surplus compensation formula (Jinja2 template)", | ||
| ) | ||
| main_formula: Optional[str] = Field( | ||
| "energy_term + power_term + others_term", | ||
| description="Main cost calculation formula (Jinja2 template)", | ||
| ) | ||
| # Billing cycle | ||
| cycle_start_day: int = Field( | ||
| default=1, description="Day of month when billing cycle starts", ge=1, le=30 | ||
| ) | ||
| @property | ||
| def is_pvpc(self) -> bool: | ||
| """Check if this configuration uses PVPC (variable pricing).""" | ||
| return all( | ||
| price is None | ||
| for price in [self.p1_kwh_eur, self.p2_kwh_eur, self.p3_kwh_eur] | ||
| ) | ||
| def __str__(self) -> str: | ||
| """String representation.""" | ||
| pricing_type = "PVPC" if self.is_pvpc else "Fixed" | ||
| return f"PricingRules({pricing_type}, P1={self.p1_kw_year_eur}€/kW/year)" | ||
| def __repr__(self) -> str: | ||
| """Developer representation.""" | ||
| return f"PricingRules(p1_kw_year_eur={self.p1_kw_year_eur}, is_pvpc={self.is_pvpc})" | ||
| class PricingAggregated(EdataBaseModel, TimestampMixin): | ||
| """Pydantic model for aggregated pricing/billing data.""" | ||
| datetime: dt = Field( | ||
| ..., description="Timestamp representing the start of the billing period" | ||
| ) | ||
| value_eur: float = Field(..., description="Total cost in EUR for the period", ge=0) | ||
| energy_term: float = Field(default=0.0, description="Energy term cost (EUR)", ge=0) | ||
| power_term: float = Field(default=0.0, description="Power term cost (EUR)", ge=0) | ||
| others_term: float = Field(default=0.0, description="Other costs term (EUR)", ge=0) | ||
| surplus_term: float = Field( | ||
| default=0.0, description="Surplus compensation term (EUR)", ge=0 | ||
| ) | ||
| delta_h: float = Field( | ||
| default=1.0, description="Duration of the billing period in hours", gt=0 | ||
| ) | ||
| @field_validator("datetime") | ||
| @classmethod | ||
| def validate_datetime_range(cls, v: dt) -> dt: | ||
| """Validate datetime is reasonable.""" | ||
| return validate_reasonable_datetime(v) | ||
| def __str__(self) -> str: | ||
| """String representation.""" | ||
| period = ( | ||
| "hour" if self.delta_h <= 1 else "day" if self.delta_h <= 24 else "month" | ||
| ) | ||
| return f"Billing({self.datetime.date()}, {self.value_eur:.2f}€/{period})" | ||
| def __repr__(self) -> str: | ||
| """Developer representation.""" | ||
| return f"PricingAggregated(datetime={self.datetime}, value_eur={self.value_eur}, delta_h={self.delta_h})" |
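The default `energy_formula` above is plain arithmetic over the rule fields. Evaluating it by hand for a single hour shows how the two tax multipliers compound (the price and consumption values are made-up samples; the tax multipliers are the examples from the field descriptions):

```python
# Default energy formula: electricity_tax * iva_tax * kwh_eur * kwh
electricity_tax = 1.05113  # 5.113% electricity tax multiplier
iva_tax = 1.21             # 21% VAT multiplier
kwh_eur = 0.15             # sample energy price in EUR/kWh (made-up)
kwh = 10.0                 # sample hourly consumption in kWh (made-up)

# Both taxes apply multiplicatively on top of the raw energy cost
energy_term = electricity_tax * iva_tax * kwh_eur * kwh
```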
| """Supply (suministro) related Pydantic models.""" | ||
| from datetime import datetime | ||
| from typing import Optional | ||
| from pydantic import Field, field_validator | ||
| from edata.models.base import ( | ||
| EdataBaseModel, | ||
| TimestampMixin, | ||
| validate_cups, | ||
| validate_reasonable_datetime, | ||
| ) | ||
| class Supply(EdataBaseModel, TimestampMixin): | ||
| """Pydantic model for electricity supply data (suministro eléctrico).""" | ||
| cups: str = Field( | ||
| ..., | ||
| description="CUPS (Código Universal de Punto de Suministro) - Universal Supply Point Code", | ||
| min_length=20, | ||
| max_length=22, | ||
| ) | ||
| date_start: datetime = Field(..., description="Supply contract start date") | ||
| date_end: datetime = Field(..., description="Supply contract end date") | ||
| address: Optional[str] = Field(None, description="Supply point address") | ||
| postal_code: Optional[str] = Field( | ||
| None, description="Postal code of the supply point", pattern=r"^\d{5}$" | ||
| ) | ||
| province: Optional[str] = Field(None, description="Province name") | ||
| municipality: Optional[str] = Field(None, description="Municipality name") | ||
| distributor: Optional[str] = Field( | ||
| None, description="Electricity distributor company name" | ||
| ) | ||
| point_type: int = Field(..., description="Type of supply point", ge=1, le=5) | ||
| distributor_code: str = Field( | ||
| ..., description="Distributor company code", min_length=1 | ||
| ) | ||
| @field_validator("cups") | ||
| @classmethod | ||
| def validate_cups_format(cls, v: str) -> str: | ||
| """Validate CUPS format.""" | ||
| return validate_cups(v) | ||
| @field_validator("date_start", "date_end") | ||
| @classmethod | ||
| def validate_date_range(cls, v: datetime) -> datetime: | ||
| """Validate date is reasonable.""" | ||
| return validate_reasonable_datetime(v) | ||
| @field_validator("date_end") | ||
| @classmethod | ||
| def validate_end_after_start(cls, v: datetime, info) -> datetime: | ||
| """Validate that end date is after start date.""" | ||
| # info.data is a dict, so use .get() rather than hasattr() | ||
| date_start = info.data.get("date_start") | ||
| if date_start is not None and v <= date_start: | ||
| raise ValueError("End date must be after start date") | ||
| return v | ||
| def __str__(self) -> str: | ||
| """String representation showing anonymized CUPS.""" | ||
| return f"Supply(cups=...{self.cups[-5:]}, distributor={self.distributor})" | ||
| def __repr__(self) -> str: | ||
| """Developer representation.""" | ||
| return f"Supply(cups={self.cups}, point_type={self.point_type})" |
| """Utility scripts for edata.""" |
| #!/usr/bin/env python3 | ||
| """ | ||
| Interactive script to perform a full dump of a CUPS into a database. | ||
| """ | ||
| import argparse | ||
| import asyncio | ||
| import getpass | ||
| import logging | ||
| from datetime import datetime, timedelta | ||
| from typing import List, Optional | ||
| from edata.connectors.datadis import DatadisConnector | ||
| from edata.const import DEFAULT_STORAGE_DIR | ||
| from edata.helpers import EdataHelper | ||
| from edata.services.database import SupplyModel as DbSupply | ||
| from edata.services.supply import SupplyService | ||
| # Configure logging | ||
| logging.basicConfig( | ||
| level=logging.INFO, format="%(asctime)s - %(name)s - %(levelname)s - %(message)s" | ||
| ) | ||
| _LOGGER = logging.getLogger(__name__) | ||
| class DumpSupply: | ||
| """Interactive helper to run a full dump of a CUPS.""" | ||
| def __init__(self, storage_dir: Optional[str] = None): | ||
| """Initialize the interactive dumper.""" | ||
| self.storage_dir = storage_dir or DEFAULT_STORAGE_DIR | ||
| self.username: Optional[str] = None | ||
| self.password: Optional[str] = None | ||
| self.authorized_nif: Optional[str] = None | ||
| self.connector: Optional[DatadisConnector] = None | ||
| self.supplies: List[DbSupply] = [] | ||
| def get_credentials(self) -> bool: | ||
| """Interactively collect the user's Datadis credentials.""" | ||
| print("\n🔐 Datadis credentials setup") | ||
| print("=" * 50) | ||
| try: | ||
| self.username = input("📧 Datadis username: ").strip() | ||
| if not self.username: | ||
| print("❌ A username is required") | ||
| return False | ||
| self.password = getpass.getpass("🔑 Datadis password: ").strip() | ||
| if not self.password: | ||
| print("❌ A password is required") | ||
| return False | ||
| nif_input = input( | ||
| "🆔 Authorized NIF (optional, Enter to skip): " | ||
| ).strip() | ||
| self.authorized_nif = nif_input if nif_input else None | ||
| return True | ||
| except KeyboardInterrupt: | ||
| print("\n❌ Operation cancelled by the user") | ||
| return False | ||
| except Exception as e: | ||
| print(f"❌ Error reading credentials: {e}") | ||
| return False | ||
| async def test_connection(self) -> bool: | ||
| """Test the connection to Datadis.""" | ||
| print("\n🧪 Testing connection to Datadis...") | ||
| try: | ||
| # Make sure credentials are available | ||
| if not self.username or not self.password: | ||
| print("❌ Credentials not available") | ||
| return False | ||
| self.connector = DatadisConnector(self.username, self.password) | ||
| # Try to authenticate | ||
| token_result = await self.connector.login() | ||
| if not token_result: | ||
| print("❌ Authentication failed. Check your credentials.") | ||
| return False | ||
| print("✅ Successfully connected to Datadis") | ||
| return True | ||
| except Exception as e: | ||
| print(f"❌ Error connecting to Datadis: {e}") | ||
| return False | ||
| async def fetch_supplies(self) -> bool: | ||
| """Fetch and list the available supplies.""" | ||
| print("\n📋 Fetching available supplies...") | ||
| try: | ||
| # Make sure credentials are available | ||
| if not self.username or not self.password: | ||
| print("❌ Credentials not available") | ||
| return False | ||
| supplies_service = SupplyService( | ||
| DatadisConnector( | ||
| username=self.username, | ||
| password=self.password, | ||
| storage_path=self.storage_dir, | ||
| ), | ||
| storage_dir=self.storage_dir, | ||
| ) | ||
| # Refresh supplies from the API | ||
| result = await supplies_service.update_supplies( | ||
| authorized_nif=self.authorized_nif | ||
| ) | ||
| if not result["success"]: | ||
| print( | ||
| f"❌ Error fetching supplies: {result.get('error', 'Unknown error')}" | ||
| ) | ||
| return False | ||
| # Load all supplies from the database | ||
| self.supplies = await supplies_service.get_supplies() | ||
| if not self.supplies: | ||
| print("❌ No supplies found in your account") | ||
| return False | ||
| print(f"✅ Found {len(self.supplies)} supplies") | ||
| return True | ||
| except Exception as e: | ||
| print(f"❌ Error fetching supplies: {e}") | ||
| return False | ||
| def display_supplies_menu(self) -> Optional[DbSupply]: | ||
| """Show the supplies menu and return the selected supply.""" | ||
| print("\n🏠 Select a supply to process:") | ||
| print("=" * 70) | ||
| for i, supply in enumerate(self.supplies, 1): | ||
| # Show a summary of each supply | ||
| cups_short = supply.cups[-10:] if len(supply.cups) > 10 else supply.cups | ||
| address = supply.address or "Address not available" | ||
| if len(address) > 40: | ||
| address = address[:40] + "..." | ||
| print(f"{i:2d}. CUPS: {cups_short} | {address}") | ||
| print( | ||
| f" 📍 {supply.municipality or 'N/A'}, {supply.province or 'N/A'} ({supply.postal_code or 'N/A'})" | ||
| ) | ||
| print( | ||
| f" 📊 Type: {supply.point_type} | Distributor: {supply.distributor or 'N/A'}" | ||
| ) | ||
| print( | ||
| f" 📅 Valid: {supply.date_start.date()} - {supply.date_end.date()}" | ||
| ) | ||
| print() | ||
| try: | ||
| selection = input( | ||
| f"Select a supply (1-{len(self.supplies)}) or 'q' to quit: " | ||
| ).strip() | ||
| if selection.lower() == "q": | ||
| return None | ||
| index = int(selection) - 1 | ||
| if 0 <= index < len(self.supplies): | ||
| return self.supplies[index] | ||
| else: | ||
| print( | ||
| f"❌ Invalid selection. It must be between 1 and {len(self.supplies)}" | ||
| ) | ||
| return self.display_supplies_menu() | ||
| except ValueError: | ||
| print("❌ Please enter a valid number") | ||
| return self.display_supplies_menu() | ||
| except KeyboardInterrupt: | ||
| print("\n❌ Operation cancelled") | ||
| return None | ||
| def get_date_range(self) -> tuple[datetime, datetime]: | ||
| """Ask the user for a date range.""" | ||
| print("\n📅 Date configuration") | ||
| print("=" * 30) | ||
| print("Leave blank to use the defaults (last 2 years)") | ||
| try: | ||
| date_from_str = input( | ||
| "📅 Start date (YYYY-MM-DD) [Enter = 2 years ago]: " | ||
| ).strip() | ||
| date_to_str = input("📅 End date (YYYY-MM-DD) [Enter = today]: ").strip() | ||
| date_from = None | ||
| date_to = None | ||
| if date_from_str: | ||
| try: | ||
| date_from = datetime.strptime(date_from_str, "%Y-%m-%d") | ||
| except ValueError: | ||
| print( | ||
| "❌ Invalid start date format, using the default" | ||
| ) | ||
| if date_to_str: | ||
| try: | ||
| date_to = datetime.strptime(date_to_str, "%Y-%m-%d") | ||
| except ValueError: | ||
| print("❌ Invalid end date format, using the default") | ||
| # Defaults | ||
| if date_from is None: | ||
| date_from = datetime.now() - timedelta(days=730) | ||
| if date_to is None: | ||
| date_to = datetime.now() | ||
| print(f"📊 Selected period: {date_from.date()} to {date_to.date()}") | ||
| return date_from, date_to | ||
| except KeyboardInterrupt: | ||
| print("❌ Using the defaults") | ||
| default_from = datetime.now() - timedelta(days=730) | ||
| default_to = datetime.now() | ||
| return default_from, default_to | ||
| async def dump_selected_supply( | ||
| self, supply: DbSupply, date_from: datetime, date_to: datetime | ||
| ) -> bool: | ||
| """Run a full dump of the selected supply.""" | ||
| print(f"🚀 Starting dump for CUPS {supply.cups[-10:]}") | ||
| print("=" * 50) | ||
| try: | ||
| # Make sure credentials are available | ||
| if not self.username or not self.password: | ||
| print("❌ Credentials not available") | ||
| return False | ||
| # Create an EdataHelper for this CUPS | ||
| helper = EdataHelper( | ||
| datadis_username=self.username, | ||
| datadis_password=self.password, | ||
| cups=supply.cups, | ||
| datadis_authorized_nif=self.authorized_nif, | ||
| storage_dir_path=self.storage_dir, | ||
| ) | ||
| print(f"📅 Period: {date_from.date()} to {date_to.date()}") | ||
| print("⏳ Downloading data... (this may take several minutes)") | ||
| # Update all data | ||
| result = await helper.update(date_from=date_from, date_to=date_to) | ||
| if not result: | ||
| print("❌ Error while downloading data") | ||
| return False | ||
| print("✅ Data downloaded successfully") | ||
| # Show statistics | ||
| await self.display_final_statistics(helper) | ||
| return True | ||
| except Exception as e: | ||
| print(f"❌ Error during the dump: {e}") | ||
| return False | ||
| async def display_final_statistics(self, helper: EdataHelper): | ||
| """Show final statistics for the completed dump.""" | ||
| print("📊 Dump statistics:") | ||
| print("=" * 50) | ||
| summary = helper.attributes | ||
| print(f"🏠 CUPS: {summary.get('cups', 'N/A')}") | ||
| # Contract information | ||
| if summary.get("contract_p1_kW") is not None: | ||
| print( | ||
| f"⚡ Contracted power P1: {summary.get('contract_p1_kW', 'N/A')} kW" | ||
| ) | ||
| if summary.get("contract_p2_kW") is not None: | ||
| print( | ||
| f"⚡ Contracted power P2: {summary.get('contract_p2_kW', 'N/A')} kW" | ||
| ) | ||
| # Consumption information | ||
| if summary.get("yesterday_kWh") is not None: | ||
| print(f"📈 Consumption yesterday: {summary.get('yesterday_kWh', 'N/A')} kWh") | ||
| if summary.get("month_kWh") is not None: | ||
| print(f"📈 Consumption this month: {summary.get('month_kWh', 'N/A')} kWh") | ||
| if summary.get("last_month_kWh") is not None: | ||
| print( | ||
| f"📈 Consumption last month: {summary.get('last_month_kWh', 'N/A')} kWh" | ||
| ) | ||
| # Maximum power information | ||
| if summary.get("max_power_kW") is not None: | ||
| print( | ||
| f"🔋 Maximum recorded power: {summary.get('max_power_kW', 'N/A')} kW" | ||
| ) | ||
| # Cost information (if available) | ||
| if summary.get("month_€") is not None: | ||
| print(f"💰 Cost this month: {summary.get('month_€', 'N/A')} €") | ||
| if summary.get("last_month_€") is not None: | ||
| print(f"💰 Cost last month: {summary.get('last_month_€', 'N/A')} €") | ||
| print(f"\n💾 Data stored in: {self.storage_dir}") | ||
| async def run_interactive_session(self) -> bool: | ||
| """Run the full interactive session.""" | ||
| print("🏠 Interactive electricity data extractor") | ||
| print("=" * 50) | ||
| print( | ||
| "This script will help you extract all the data for your electricity supply" | ||
| ) | ||
| print() | ||
| try: | ||
| # 1. Get credentials | ||
| if not self.get_credentials(): | ||
| return False | ||
| # 2. Test the connection | ||
| if not await self.test_connection(): | ||
| return False | ||
| # 3. Fetch supplies | ||
| if not await self.fetch_supplies(): | ||
| return False | ||
| # 4. Show the menu and select a supply | ||
| selected_supply = self.display_supplies_menu() | ||
| if not selected_supply: | ||
| print("👋 Operation cancelled") | ||
| return False | ||
| print( | ||
| f"\n✅ Selected: {selected_supply.cups[-10:]} - {selected_supply.address or 'No address'}" | ||
| ) | ||
| # 5. Configure the dates | ||
| date_from, date_to = self.get_date_range() | ||
| # 6. Run the dump | ||
| success = await self.dump_selected_supply( | ||
| selected_supply, date_from, date_to | ||
| ) | ||
| if success: | ||
| print("\n🎉 Dump completed successfully!") | ||
| print("All data has been stored in the local database.") | ||
| return success | ||
| except KeyboardInterrupt: | ||
| print("\n\n👋 Operation cancelled by the user") | ||
| return False | ||
| except Exception as e: | ||
| print(f"\n❌ Error during the interactive session: {e}") | ||
| return False | ||
| async def main(): | ||
| """Entry point.""" | ||
| parser = argparse.ArgumentParser( | ||
| description="Interactive electricity data extractor", | ||
| formatter_class=argparse.RawDescriptionHelpFormatter, | ||
| epilog=""" | ||
| Usage examples: | ||
| # Interactive mode | ||
| python -m edata.scripts.dump | ||
| # With a custom storage directory | ||
| python -m edata.scripts.dump --storage-dir /path/to/data | ||
| """, | ||
| ) | ||
| parser.add_argument( | ||
| "--storage-dir", | ||
| default=".", | ||
| help="Storage directory (default: current directory)", | ||
| ) | ||
| args = parser.parse_args() | ||
| # Create the dumper | ||
| dumper = DumpSupply(storage_dir=args.storage_dir) | ||
| # Run the interactive session | ||
| success = await dumper.run_interactive_session() | ||
| raise SystemExit(0 if success else 1) | ||
| if __name__ == "__main__": | ||
| asyncio.run(main()) |
| """Services package for edata.""" | ||
| from edata.services.billing import BillingService | ||
| from edata.services.consumption import ConsumptionService | ||
| from edata.services.contract import ContractService | ||
| from edata.services.database import DatabaseService, get_database_service | ||
| from edata.services.maximeter import MaximeterService | ||
| from edata.services.supply import SupplyService | ||
| __all__ = [ | ||
| "DatabaseService", | ||
| "get_database_service", | ||
| "SupplyService", | ||
| "ContractService", | ||
| "ConsumptionService", | ||
| "MaximeterService", | ||
| "BillingService", | ||
| ] |
| """Billing service for managing energy prices and billing calculations.""" | ||
| import contextlib | ||
| import logging | ||
| from datetime import datetime, timedelta | ||
| from typing import Any, Dict, List, Optional | ||
| from jinja2 import Environment | ||
| from edata.connectors.redata import REDataConnector | ||
| from edata.models.pricing import PricingAggregated, PricingData, PricingRules | ||
| from edata.services.database import PVPCPricesModel, get_database_service | ||
| _LOGGER = logging.getLogger(__name__) | ||
| class BillingService: | ||
| """Service for managing energy pricing and billing data.""" | ||
| def __init__(self, storage_dir: Optional[str] = None): | ||
| """Initialize billing service. | ||
| Args: | ||
| storage_dir: Directory for database storage | ||
| """ | ||
| self._redata = REDataConnector() | ||
| self._storage_dir = storage_dir | ||
| self._db_service = None | ||
| async def _get_db_service(self): | ||
| """Get database service, initializing if needed.""" | ||
| if self._db_service is None: | ||
| self._db_service = await get_database_service(self._storage_dir) | ||
| return self._db_service | ||
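`_get_db_service` lazily awaits the database service once and then reuses the cached instance. The same pattern in a stdlib-only sketch (names are illustrative; note that two concurrent first calls could both run the factory, which a lock would prevent):

```python
import asyncio


class LazyHolder:
    """Caches the result of an async factory after the first await."""

    def __init__(self):
        self._service = None

    async def get(self):
        # Initialize only on first use, then reuse.
        if self._service is None:
            self._service = await self.make_service()
        return self._service

    async def make_service(self):
        await asyncio.sleep(0)  # stands in for real async setup
        return object()


async def demo():
    holder = LazyHolder()
    first = await holder.get()
    second = await holder.get()
    return first is second  # the cached instance is reused


print(asyncio.run(demo()))
```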
| async def update_pvpc_prices( | ||
| self, start_date: datetime, end_date: datetime, is_ceuta_melilla: bool = False | ||
| ) -> Dict[str, Any]: | ||
| """Update PVPC prices from REData API. | ||
| Args: | ||
| start_date: Start date for price data | ||
| end_date: End date for price data | ||
| is_ceuta_melilla: Whether to get prices for Ceuta/Melilla (True) or Peninsula (False) | ||
| Returns: | ||
| Dict with operation results and statistics | ||
| """ | ||
| geo_id = 8744 if is_ceuta_melilla else 8741 | ||
| region = "Ceuta/Melilla" if is_ceuta_melilla else "Peninsula" | ||
| _LOGGER.info( | ||
| f"Updating PVPC prices for {region} from {start_date.date()} to {end_date.date()}" | ||
| ) | ||
| # Determine actual start date based on existing data | ||
| actual_start_date = start_date | ||
| db_service = await self._get_db_service() | ||
| last_price_record = await db_service.get_latest_pvpc_price(geo_id=geo_id) | ||
| if last_price_record: | ||
| # Start from the hour after the last price record | ||
| actual_start_date = max( | ||
| start_date, last_price_record.datetime + timedelta(hours=1) | ||
| ) | ||
| _LOGGER.info( | ||
| f"Found existing price data up to {last_price_record.datetime.date()}, fetching from {actual_start_date.date()}" | ||
| ) | ||
| else: | ||
| _LOGGER.info( | ||
| f"No existing price data found for {region}, fetching all data" | ||
| ) | ||
| # If actual start date is beyond end date, no new data needed | ||
| if actual_start_date >= end_date: | ||
| _LOGGER.info(f"No new price data needed for {region} (up to date)") | ||
| return { | ||
| "success": True, | ||
| "region": region, | ||
| "geo_id": geo_id, | ||
| "period": { | ||
| "start": start_date.isoformat(), | ||
| "end": end_date.isoformat(), | ||
| "actual_start": actual_start_date.isoformat(), | ||
| }, | ||
| "stats": { | ||
| "fetched": 0, | ||
| "saved": 0, | ||
| "updated": 0, | ||
| "skipped": "up_to_date", | ||
| }, | ||
| "message": "Price data is up to date", | ||
| } | ||
| try: | ||
| # Fetch price data from REData (only missing data) | ||
| prices = await self._redata.get_realtime_prices( | ||
| dt_from=actual_start_date, | ||
| dt_to=end_date, | ||
| is_ceuta_melilla=is_ceuta_melilla, | ||
| ) | ||
| # Save to database | ||
| saved_count = 0 | ||
| updated_count = 0 | ||
| for price in prices: | ||
| price_dict = price.model_dump() | ||
| price_dict["geo_id"] = geo_id | ||
| # Check if price already exists for this specific datetime and geo_id | ||
| existing = await db_service.get_pvpc_prices( | ||
| start_date=price.datetime, end_date=price.datetime, geo_id=geo_id | ||
| ) | ||
| if existing: | ||
| updated_count += 1 | ||
| else: | ||
| saved_count += 1 | ||
| await db_service.save_pvpc_price(price_dict) | ||
| result = { | ||
| "success": True, | ||
| "region": region, | ||
| "geo_id": geo_id, | ||
| "period": { | ||
| "start": start_date.isoformat(), | ||
| "end": end_date.isoformat(), | ||
| "actual_start": actual_start_date.isoformat(), | ||
| }, | ||
| "stats": { | ||
| "fetched": len(prices), | ||
| "saved": saved_count, | ||
| "updated": updated_count, | ||
| }, | ||
| } | ||
| if actual_start_date > start_date: | ||
| result["message"] = ( | ||
| f"Fetched only missing price data from {actual_start_date.date()}" | ||
| ) | ||
| _LOGGER.info( | ||
| f"PVPC price update completed: {len(prices)} fetched, " | ||
| f"{saved_count} saved, {updated_count} updated" | ||
| ) | ||
| return result | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error updating PVPC prices for {region}: {str(e)}") | ||
| return { | ||
| "success": False, | ||
| "region": region, | ||
| "geo_id": geo_id, | ||
| "error": str(e), | ||
| "period": { | ||
| "start": start_date.isoformat(), | ||
| "end": end_date.isoformat(), | ||
| "actual_start": ( | ||
| actual_start_date.isoformat() | ||
| if "actual_start_date" in locals() | ||
| else start_date.isoformat() | ||
| ), | ||
| }, | ||
| } | ||
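The incremental update above reduces to one window computation: resume one hour after the newest stored record, clamped to the requested range, and skip the fetch entirely when nothing is missing. A stdlib sketch of that logic (the function name is illustrative):

```python
from datetime import datetime, timedelta
from typing import Optional


def fetch_window(
    start: datetime, end: datetime, last_stored: Optional[datetime]
) -> Optional[tuple[datetime, datetime]]:
    """Return the (start, end) range still missing, or None if up to date."""
    actual_start = start
    if last_stored is not None:
        # Resume one hour after the newest stored price record.
        actual_start = max(start, last_stored + timedelta(hours=1))
    if actual_start >= end:
        return None  # nothing new to fetch
    return actual_start, end


window = fetch_window(
    datetime(2024, 1, 1),
    datetime(2024, 1, 10),
    last_stored=datetime(2024, 1, 5, 23),
)
print(window)  # resumes at 2024-01-06 00:00
```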
| def get_custom_prices( | ||
| self, pricing_rules: PricingRules, start_date: datetime, end_date: datetime | ||
| ) -> List[PricingData]: | ||
| """Calculate custom energy prices dynamically based on pricing rules. | ||
| Args: | ||
| pricing_rules: Custom pricing configuration | ||
| start_date: Start date for price data | ||
| end_date: End date for price data | ||
| Returns: | ||
| List of PricingData objects calculated on-the-fly | ||
| """ | ||
| if pricing_rules.is_pvpc: | ||
| raise ValueError("Use get_stored_pvpc_prices() for PVPC pricing rules") | ||
| _LOGGER.info( | ||
| f"Calculating custom prices from {start_date.date()} to {end_date.date()}" | ||
| ) | ||
| try: | ||
| # Import here to avoid circular imports | ||
| from edata.utils import get_pvpc_tariff | ||
| prices = [] | ||
| # Generate hourly prices based on custom rules | ||
| current_dt = start_date | ||
| while current_dt < end_date: | ||
| # Determine tariff period for this hour | ||
| tariff = get_pvpc_tariff(current_dt) | ||
| # Get the appropriate price based on tariff period | ||
| if tariff == "p1" and pricing_rules.p1_kwh_eur is not None: | ||
| price_eur_kwh = pricing_rules.p1_kwh_eur | ||
| elif tariff == "p2" and pricing_rules.p2_kwh_eur is not None: | ||
| price_eur_kwh = pricing_rules.p2_kwh_eur | ||
| elif tariff == "p3" and pricing_rules.p3_kwh_eur is not None: | ||
| price_eur_kwh = pricing_rules.p3_kwh_eur | ||
| else: | ||
| # Skip if no price defined for this period | ||
| current_dt += timedelta(hours=1) | ||
| continue | ||
| # Create PricingData object | ||
| price_data = PricingData( | ||
| datetime=current_dt, value_eur_kwh=price_eur_kwh, delta_h=1.0 | ||
| ) | ||
| prices.append(price_data) | ||
| current_dt += timedelta(hours=1) | ||
| _LOGGER.info(f"Generated {len(prices)} custom price points") | ||
| return prices | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error calculating custom prices: {str(e)}") | ||
| raise | ||
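The hourly loop in `get_custom_prices` maps each hour to a 2.0TD tariff period and applies the matching rate, skipping hours with no configured price. A self-contained sketch with a simplified period function (the real `get_pvpc_tariff` also treats national holidays as off-peak, which this omits; the rates below are invented):

```python
from datetime import datetime, timedelta


def tariff_period(dt: datetime) -> str:
    """Simplified 2.0TD periods: weekends and nights p3, peak hours p1, rest p2."""
    if dt.weekday() >= 5 or dt.hour < 8:
        return "p3"
    if 10 <= dt.hour < 14 or 18 <= dt.hour < 22:
        return "p1"
    return "p2"


def hourly_prices(start: datetime, end: datetime, rates: dict) -> list:
    """Generate (datetime, eur_per_kwh) pairs, skipping undefined periods."""
    prices = []
    current = start
    while current < end:
        rate = rates.get(tariff_period(current))
        if rate is not None:
            prices.append((current, rate))
        current += timedelta(hours=1)
    return prices


# One full weekday (Monday 2024-01-08) with all three rates defined.
day = hourly_prices(
    datetime(2024, 1, 8), datetime(2024, 1, 9), {"p1": 0.20, "p2": 0.15, "p3": 0.08}
)
print(len(day))  # 24 hourly points
```

When a period has no rate, its hours are simply dropped, mirroring the `continue` branch in the service.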
| async def get_stored_pvpc_prices( | ||
| self, | ||
| start_date: Optional[datetime] = None, | ||
| end_date: Optional[datetime] = None, | ||
| geo_id: Optional[int] = None, | ||
| ) -> List[PVPCPricesModel]: | ||
| """Get stored PVPC prices from database. | ||
| Args: | ||
| start_date: Optional start date filter | ||
| end_date: Optional end date filter | ||
| geo_id: Optional geographic filter | ||
| Returns: | ||
| List of PVPCPrices objects | ||
| """ | ||
| db_service = await self._get_db_service() | ||
| return await db_service.get_pvpc_prices(start_date, end_date, geo_id) | ||
| async def get_prices( | ||
| self, | ||
| pricing_rules: PricingRules, | ||
| start_date: datetime, | ||
| end_date: datetime, | ||
| is_ceuta_melilla: bool = False, | ||
| ) -> Optional[List[PricingData]]: | ||
| """Get prices automatically based on pricing rules configuration. | ||
| Args: | ||
| pricing_rules: Pricing configuration | ||
| start_date: Start date for price data | ||
| end_date: End date for price data | ||
| is_ceuta_melilla: Whether to get PVPC prices for Ceuta/Melilla | ||
| Returns: | ||
| List of PricingData objects or None if missing required data | ||
| """ | ||
| if pricing_rules.is_pvpc: | ||
| # Get stored PVPC prices from database | ||
| geo_id = 8744 if is_ceuta_melilla else 8741 | ||
| pvpc_prices = await self.get_stored_pvpc_prices( | ||
| start_date, end_date, geo_id | ||
| ) | ||
| # Return None if no PVPC prices found | ||
| if not pvpc_prices: | ||
| _LOGGER.warning( | ||
| f"No PVPC prices found for geo_id {geo_id} from {start_date.date()} to {end_date.date()}" | ||
| ) | ||
| return None | ||
| # Convert PVPCPrices to PricingData | ||
| return [ | ||
| PricingData( | ||
| datetime=price.datetime, | ||
| value_eur_kwh=price.value_eur_kwh, | ||
| delta_h=price.delta_h, | ||
| ) | ||
| for price in pvpc_prices | ||
| ] | ||
| else: | ||
| # Check if custom pricing rules have required data | ||
| if ( | ||
| pricing_rules.p1_kwh_eur is None | ||
| and pricing_rules.p2_kwh_eur is None | ||
| and pricing_rules.p3_kwh_eur is None | ||
| ): | ||
| _LOGGER.warning("No custom energy prices defined in pricing rules") | ||
| return None | ||
| # Calculate custom prices dynamically | ||
| try: | ||
| custom_prices = self.get_custom_prices( | ||
| pricing_rules, start_date, end_date | ||
| ) | ||
| # Return None if no prices could be generated | ||
| if not custom_prices: | ||
| _LOGGER.warning( | ||
| f"No custom prices could be generated for period {start_date.date()} to {end_date.date()}" | ||
| ) | ||
| return None | ||
| return custom_prices | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error generating custom prices: {str(e)}") | ||
| return None | ||
| async def get_cost( | ||
| self, | ||
| cups: str, | ||
| pricing_rules: PricingRules, | ||
| start_date: datetime, | ||
| end_date: datetime, | ||
| is_ceuta_melilla: bool = False, | ||
| ) -> PricingAggregated: | ||
| """Get billing cost for a period based on pricing rules. | ||
| First checks the billing table for existing data with the pricing rules hash. | ||
| If not found, calls update_missing_costs to calculate and store the data. | ||
| Then returns the aggregated cost from the billing table. | ||
| Args: | ||
| cups: CUPS identifier for consumption data | ||
| pricing_rules: Pricing configuration | ||
| start_date: Start date for cost calculation | ||
| end_date: End date for cost calculation | ||
| is_ceuta_melilla: Whether to use Ceuta/Melilla PVPC prices | ||
| Returns: | ||
| PricingAggregated object with cost breakdown for the period | ||
| """ | ||
| _LOGGER.info( | ||
| f"Getting cost for CUPS {cups} from {start_date.date()} to {end_date.date()}" | ||
| ) | ||
| try: | ||
| # Generate pricing configuration hash | ||
| db_service = await self._get_db_service() | ||
| pricing_config_hash = db_service.generate_pricing_config_hash( | ||
| pricing_rules.model_dump() | ||
| ) | ||
| # Check if billing data already exists by looking for the latest billing record | ||
| latest_billing = await db_service.get_latest_billing( | ||
| cups=cups, pricing_config_hash=pricing_config_hash | ||
| ) | ||
| # Determine if we need to calculate missing costs | ||
| needs_calculation = False | ||
| actual_start_date = start_date | ||
| if not latest_billing: | ||
| # No billing data exists for this configuration | ||
| needs_calculation = True | ||
| _LOGGER.info( | ||
| f"No billing data found for hash {pricing_config_hash[:8]}..., calculating all costs" | ||
| ) | ||
| elif latest_billing.datetime < end_date - timedelta(hours=1): | ||
| # Billing data exists but is incomplete for the requested period | ||
| needs_calculation = True | ||
| actual_start_date = max( | ||
| start_date, latest_billing.datetime + timedelta(hours=1) | ||
| ) | ||
| _LOGGER.info( | ||
| f"Found billing data up to {latest_billing.datetime.date()}, calculating from {actual_start_date.date()}" | ||
| ) | ||
| # Calculate missing costs if needed | ||
| if needs_calculation: | ||
| update_result = await self.update_missing_costs( | ||
| cups, | ||
| pricing_rules, | ||
| actual_start_date, | ||
| end_date, | ||
| is_ceuta_melilla, | ||
| force_recalculate=False, | ||
| ) | ||
| if not update_result["success"]: | ||
| _LOGGER.error( | ||
| f"Failed to update costs: {update_result.get('error', 'Unknown error')}" | ||
| ) | ||
| return PricingAggregated( | ||
| datetime=start_date, | ||
| value_eur=0.0, | ||
| energy_term=0.0, | ||
| power_term=0.0, | ||
| others_term=0.0, | ||
| surplus_term=0.0, | ||
| delta_h=(end_date - start_date).total_seconds() / 3600, | ||
| ) | ||
| # Get the complete billing data for the requested period | ||
| existing_billing = await db_service.get_billing( | ||
| cups=cups, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| pricing_config_hash=pricing_config_hash, | ||
| ) | ||
| # Aggregate the billing data | ||
| total_value_eur = 0.0 | ||
| total_energy_term = 0.0 | ||
| total_power_term = 0.0 | ||
| total_others_term = 0.0 | ||
| total_surplus_term = 0.0 | ||
| total_hours = len(existing_billing) | ||
| for billing in existing_billing: | ||
| total_value_eur += billing.total_eur or 0.0 | ||
| total_energy_term += billing.energy_term or 0.0 | ||
| total_power_term += billing.power_term or 0.0 | ||
| total_others_term += billing.others_term or 0.0 | ||
| total_surplus_term += billing.surplus_term or 0.0 | ||
| result = PricingAggregated( | ||
| datetime=start_date, | ||
| value_eur=round(total_value_eur, 6), | ||
| energy_term=round(total_energy_term, 6), | ||
| power_term=round(total_power_term, 6), | ||
| others_term=round(total_others_term, 6), | ||
| surplus_term=round(total_surplus_term, 6), | ||
| delta_h=total_hours, | ||
| ) | ||
| _LOGGER.info( | ||
| f"Cost calculation completed for CUPS {cups}: " | ||
| f"€{total_value_eur:.2f} for {total_hours} hours" | ||
| ) | ||
| return result | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error getting cost for CUPS {cups}: {str(e)}") | ||
| raise | ||
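The aggregation step in `get_cost` is a plain sum of the per-hour terms, with missing (`None`) values counted as zero and `delta_h` taken as the number of hourly rows. A minimal sketch over dicts (the real code iterates ORM rows):

```python
def aggregate_billing(rows: list) -> dict:
    """Sum hourly cost terms; missing (None) terms count as zero."""
    terms = ("total_eur", "energy_term", "power_term", "others_term", "surplus_term")
    totals = {t: 0.0 for t in terms}
    for row in rows:
        for t in terms:
            totals[t] += row.get(t) or 0.0
    totals["delta_h"] = len(rows)  # one row per hour
    return {k: round(v, 6) for k, v in totals.items()}


rows = [
    {"total_eur": 0.12, "energy_term": 0.10, "power_term": 0.02},
    {"total_eur": 0.08, "energy_term": 0.07, "power_term": 0.01, "others_term": None},
]
print(aggregate_billing(rows)["total_eur"])  # 0.2
```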
| async def update_missing_costs( | ||
| self, | ||
| cups: str, | ||
| pricing_rules: PricingRules, | ||
| start_date: datetime, | ||
| end_date: datetime, | ||
| is_ceuta_melilla: bool = False, | ||
| force_recalculate: bool = False, | ||
| ) -> Dict[str, Any]: | ||
| """Calculate and store billing costs in the database. | ||
| Args: | ||
| cups: CUPS identifier for consumption data | ||
| pricing_rules: Pricing configuration | ||
| start_date: Start date for cost calculation | ||
| end_date: End date for cost calculation | ||
| is_ceuta_melilla: Whether to use Ceuta/Melilla PVPC prices | ||
| force_recalculate: If True, recalculate even if billing data exists | ||
| Returns: | ||
| Dict with operation results and statistics | ||
| """ | ||
| _LOGGER.info( | ||
| f"Updating costs for CUPS {cups} from {start_date.date()} to {end_date.date()}" | ||
| ) | ||
| try: | ||
| # Generate pricing configuration hash | ||
| db_service = await self._get_db_service() | ||
| pricing_config_hash = db_service.generate_pricing_config_hash( | ||
| pricing_rules.model_dump() | ||
| ) | ||
| # Get existing billing data if not forcing recalculation | ||
| existing_billing = [] | ||
| if not force_recalculate: | ||
| existing_billing = await db_service.get_billing( | ||
| cups=cups, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| pricing_config_hash=pricing_config_hash, | ||
| ) | ||
| # Create set of existing datetime for quick lookup | ||
| existing_hours = {billing.datetime for billing in existing_billing} | ||
| # Get consumption data | ||
| consumptions = await db_service.get_consumptions(cups, start_date, end_date) | ||
| if not consumptions: | ||
| _LOGGER.warning( | ||
| f"No consumption data found for CUPS {cups} in the specified period" | ||
| ) | ||
| return { | ||
| "success": False, | ||
| "error": "No consumption data found", | ||
| "cups": cups, | ||
| "period": { | ||
| "start": start_date.isoformat(), | ||
| "end": end_date.isoformat(), | ||
| }, | ||
| } | ||
| # Get contract data for power terms | ||
| contracts = await db_service.get_contracts(cups) | ||
| if not contracts: | ||
| _LOGGER.warning( | ||
| f"No contract data found for CUPS {cups}, using defaults" | ||
| ) | ||
| # Use default power values if no contracts found | ||
| default_contract = { | ||
| "power_p1": 3.45, # Default residential power | ||
| "power_p2": 3.45, | ||
| "date_start": start_date, | ||
| "date_end": end_date, | ||
| } | ||
| contracts = [type("MockContract", (), default_contract)()] | ||
| # Get pricing data | ||
| prices = await self.get_prices( | ||
| pricing_rules, start_date, end_date, is_ceuta_melilla | ||
| ) | ||
| if prices is None: | ||
| _LOGGER.warning( | ||
| f"No pricing data available for CUPS {cups} in the specified period" | ||
| ) | ||
| return { | ||
| "success": False, | ||
| "error": "No pricing data available", | ||
| "cups": cups, | ||
| "period": { | ||
| "start": start_date.isoformat(), | ||
| "end": end_date.isoformat(), | ||
| }, | ||
| } | ||
| # Create price lookup by datetime | ||
| price_lookup = {price.datetime: price.value_eur_kwh for price in prices} | ||
| # Build data structure similar to billing processor | ||
| data = {} | ||
| for consumption in consumptions: | ||
| data[consumption.datetime] = { | ||
| "datetime": consumption.datetime, | ||
| "kwh": consumption.value_kwh, | ||
| "surplus_kwh": ( | ||
| consumption.surplus_kwh | ||
| if hasattr(consumption, "surplus_kwh") | ||
| and consumption.surplus_kwh is not None | ||
| else 0 | ||
| ), | ||
| } | ||
| # Add contract power data | ||
| for contract in contracts: | ||
| start_dt = getattr(contract, "date_start", start_date) | ||
| end_dt = getattr(contract, "date_end", end_date) | ||
| current = start_dt | ||
| while current <= end_dt and current <= end_date: | ||
| if current in data: | ||
| data[current]["p1_kw"] = getattr(contract, "power_p1", 3.45) | ||
| data[current]["p2_kw"] = getattr(contract, "power_p2", 3.45) | ||
| current += timedelta(hours=1) | ||
| # Add pricing data | ||
| for dt, kwh_eur in price_lookup.items(): | ||
| if dt in data: | ||
| data[dt]["kwh_eur"] = kwh_eur | ||
| # Prepare Jinja2 expressions for cost calculation | ||
| env = Environment() | ||
| energy_expr = env.compile_expression( | ||
| f"({pricing_rules.energy_formula})|float" | ||
| ) | ||
| power_expr = env.compile_expression( | ||
| f"({pricing_rules.power_formula})|float" | ||
| ) | ||
| others_expr = env.compile_expression( | ||
| f"({pricing_rules.others_formula})|float" | ||
| ) | ||
| surplus_expr = env.compile_expression( | ||
| f"({pricing_rules.surplus_formula})|float" | ||
| ) | ||
| main_expr = env.compile_expression(f"({pricing_rules.main_formula})|float") | ||
| # Calculate and save costs for each hour | ||
| saved_count = 0 | ||
| updated_count = 0 | ||
| skipped_count = 0 | ||
| for dt in sorted(data.keys()): | ||
| # Skip if already exists and not forcing recalculation | ||
| if not force_recalculate and dt in existing_hours: | ||
| skipped_count += 1 | ||
| continue | ||
| hour_data = data[dt] | ||
| # Add pricing rules to hour data | ||
| hour_data.update(pricing_rules.model_dump()) | ||
| # Import here to avoid circular imports | ||
| from edata.utils import get_pvpc_tariff | ||
| tariff = get_pvpc_tariff(hour_data["datetime"]) | ||
| # Set energy price if not already set | ||
| if "kwh_eur" not in hour_data: | ||
| if tariff == "p1" and pricing_rules.p1_kwh_eur is not None: | ||
| hour_data["kwh_eur"] = pricing_rules.p1_kwh_eur | ||
| elif tariff == "p2" and pricing_rules.p2_kwh_eur is not None: | ||
| hour_data["kwh_eur"] = pricing_rules.p2_kwh_eur | ||
| elif tariff == "p3" and pricing_rules.p3_kwh_eur is not None: | ||
| hour_data["kwh_eur"] = pricing_rules.p3_kwh_eur | ||
| else: | ||
| continue # Skip if no price available | ||
| # Set surplus price based on tariff | ||
| if tariff == "p1": | ||
| hour_data["surplus_kwh_eur"] = pricing_rules.surplus_p1_kwh_eur or 0 | ||
| elif tariff == "p2": | ||
| hour_data["surplus_kwh_eur"] = pricing_rules.surplus_p2_kwh_eur or 0 | ||
| elif tariff == "p3": | ||
| hour_data["surplus_kwh_eur"] = pricing_rules.surplus_p3_kwh_eur or 0 | ||
| # Calculate individual cost terms | ||
| energy_term = 0.0 | ||
| power_term = 0.0 | ||
| others_term = 0.0 | ||
| surplus_term = 0.0 | ||
| # Evaluate each term independently so one failing formula | ||
| # does not silently zero out the remaining terms | ||
| with contextlib.suppress(Exception): | ||
| result = energy_expr(**hour_data) | ||
| energy_term = round(float(result), 6) if result is not None else 0.0 | ||
| with contextlib.suppress(Exception): | ||
| result = power_expr(**hour_data) | ||
| power_term = round(float(result), 6) if result is not None else 0.0 | ||
| with contextlib.suppress(Exception): | ||
| result = others_expr(**hour_data) | ||
| others_term = round(float(result), 6) if result is not None else 0.0 | ||
| with contextlib.suppress(Exception): | ||
| result = surplus_expr(**hour_data) | ||
| surplus_term = round(float(result), 6) if result is not None else 0.0 | ||
| # Calculate total using main formula | ||
| cost_data = { | ||
| "energy_term": energy_term, | ||
| "power_term": power_term, | ||
| "others_term": others_term, | ||
| "surplus_term": surplus_term, | ||
| **pricing_rules.model_dump(), | ||
| } | ||
| total_eur = 0.0 | ||
| with contextlib.suppress(Exception): | ||
| result = main_expr(**cost_data) | ||
| total_eur = round(float(result), 6) if result is not None else 0.0 | ||
| # Prepare billing data (only calculated terms, not raw data) | ||
| billing_data = { | ||
| "cups": cups, | ||
| "datetime": dt, | ||
| "energy_term": energy_term, | ||
| "power_term": power_term, | ||
| "others_term": others_term, | ||
| "surplus_term": surplus_term, | ||
| "total_eur": total_eur, | ||
| "tariff": tariff, | ||
| "pricing_config_hash": pricing_config_hash, | ||
| } | ||
| # Save to database | ||
| await db_service.save_billing(billing_data) | ||
| if dt in existing_hours: | ||
| updated_count += 1 | ||
| else: | ||
| saved_count += 1 | ||
| result = { | ||
| "success": True, | ||
| "cups": cups, | ||
| "pricing_config_hash": pricing_config_hash, | ||
| "period": { | ||
| "start": start_date.isoformat(), | ||
| "end": end_date.isoformat(), | ||
| }, | ||
| "stats": { | ||
| "total_consumptions": len(consumptions), | ||
| "saved": saved_count, | ||
| "updated": updated_count, | ||
| "skipped": skipped_count, | ||
| "processed": saved_count + updated_count, | ||
| }, | ||
| } | ||
| _LOGGER.info( | ||
| f"Billing cost update completed for CUPS {cups}: " | ||
| f"{saved_count} saved, {updated_count} updated, {skipped_count} skipped" | ||
| ) | ||
| return result | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error updating costs for CUPS {cups}: {str(e)}") | ||
| return { | ||
| "success": False, | ||
| "error": str(e), | ||
| "cups": cups, | ||
| "period": { | ||
| "start": start_date.isoformat(), | ||
| "end": end_date.isoformat(), | ||
| }, | ||
| } | ||
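The cost terms in `update_missing_costs` are computed by compiling each pricing formula once with Jinja2's `compile_expression` and invoking the compiled expression per hour with the hour's data as keyword arguments. A minimal sketch of that pattern, assuming Jinja2 is available; the formula string and the `kwh`/`kwh_eur` variable names are illustrative placeholders, not the library's actual pricing fields:

```python
from jinja2 import Environment

env = Environment()
# Compile the formula once; wrapping it in (...)|float mirrors the
# service code and keeps the result numeric.
energy_expr = env.compile_expression("(kwh * kwh_eur)|float")

# Evaluate against one hour of data; missing variables yield None
# by default, which the caller must handle before rounding.
result = energy_expr(kwh=0.5, kwh_eur=0.20)
energy_term = round(float(result), 6) if result is not None else 0.0
print(energy_term)  # → 0.1
```

Compiling once and evaluating many times avoids re-parsing the formula for every hour of the billing period.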
| async def get_daily_costs( | ||
| self, | ||
| cups: str, | ||
| pricing_rules: PricingRules, | ||
| start_date: datetime, | ||
| end_date: datetime, | ||
| is_ceuta_melilla: bool = False, | ||
| ) -> List[PricingAggregated]: | ||
| """Get daily aggregated billing costs for a period. | ||
| Args: | ||
| cups: CUPS identifier for consumption data | ||
| pricing_rules: Pricing configuration | ||
| start_date: Start date for cost calculation | ||
| end_date: End date for cost calculation | ||
| is_ceuta_melilla: Whether to use Ceuta/Melilla PVPC prices | ||
| Returns: | ||
| List of PricingAggregated objects, one per day | ||
| """ | ||
| _LOGGER.info( | ||
| f"Getting daily costs for CUPS {cups} from {start_date.date()} to {end_date.date()}" | ||
| ) | ||
| try: | ||
| # Generate pricing configuration hash | ||
| db_service = await self._get_db_service() | ||
| pricing_config_hash = db_service.generate_pricing_config_hash( | ||
| pricing_rules.model_dump() | ||
| ) | ||
| # Get billing data for the period | ||
| billing_records = await db_service.get_billing( | ||
| cups=cups, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| pricing_config_hash=pricing_config_hash, | ||
| ) | ||
| # If no billing data exists, calculate and store it first | ||
| if not billing_records: | ||
| _LOGGER.info("No billing data found, calculating costs first") | ||
| update_result = await self.update_missing_costs( | ||
| cups, | ||
| pricing_rules, | ||
| start_date, | ||
| end_date, | ||
| is_ceuta_melilla, | ||
| force_recalculate=False, | ||
| ) | ||
| if not update_result["success"]: | ||
| _LOGGER.error( | ||
| f"Failed to update costs: {update_result.get('error', 'Unknown error')}" | ||
| ) | ||
| return [] | ||
| # Get the newly calculated billing data | ||
| billing_records = await db_service.get_billing( | ||
| cups=cups, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| pricing_config_hash=pricing_config_hash, | ||
| ) | ||
| # Group by day and aggregate | ||
| daily_aggregates = {} | ||
| for billing in billing_records: | ||
| # Get the date (without time) as key | ||
| date_key = billing.datetime.date() | ||
| if date_key not in daily_aggregates: | ||
| daily_aggregates[date_key] = { | ||
| "datetime": datetime.combine(date_key, datetime.min.time()), | ||
| "total_eur": 0.0, | ||
| "energy_term": 0.0, | ||
| "power_term": 0.0, | ||
| "others_term": 0.0, | ||
| "surplus_term": 0.0, | ||
| "hours": 0, | ||
| } | ||
| # Add this hour's costs | ||
| daily_aggregates[date_key]["total_eur"] += billing.total_eur or 0.0 | ||
| daily_aggregates[date_key]["energy_term"] += billing.energy_term or 0.0 | ||
| daily_aggregates[date_key]["power_term"] += billing.power_term or 0.0 | ||
| daily_aggregates[date_key]["others_term"] += billing.others_term or 0.0 | ||
| daily_aggregates[date_key]["surplus_term"] += ( | ||
| billing.surplus_term or 0.0 | ||
| ) | ||
| daily_aggregates[date_key]["hours"] += 1 | ||
| # Convert to PricingAggregated objects | ||
| result = [] | ||
| for date_key in sorted(daily_aggregates.keys()): | ||
| agg = daily_aggregates[date_key] | ||
| pricing_agg = PricingAggregated( | ||
| datetime=agg["datetime"], | ||
| value_eur=round(agg["total_eur"], 6), | ||
| energy_term=round(agg["energy_term"], 6), | ||
| power_term=round(agg["power_term"], 6), | ||
| others_term=round(agg["others_term"], 6), | ||
| surplus_term=round(agg["surplus_term"], 6), | ||
| delta_h=agg["hours"], | ||
| ) | ||
| result.append(pricing_agg) | ||
| _LOGGER.info(f"Generated {len(result)} daily cost aggregates") | ||
| return result | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error getting daily costs for CUPS {cups}: {str(e)}") | ||
| raise | ||
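The grouping step in `get_daily_costs` reduces hourly billing rows to per-day totals keyed by `datetime.date()`. The same pattern in isolation, with simplified stand-in records instead of the billing model:

```python
from collections import defaultdict
from datetime import datetime

# Toy hourly records: (timestamp, total_eur)
hourly = [
    (datetime(2024, 5, 1, 0), 0.10),
    (datetime(2024, 5, 1, 1), 0.12),
    (datetime(2024, 5, 2, 0), 0.20),
]

# Accumulate totals and an hour count per calendar day
daily = defaultdict(lambda: {"total_eur": 0.0, "hours": 0})
for ts, eur in hourly:
    day = daily[ts.date()]
    day["total_eur"] += eur
    day["hours"] += 1

for d in sorted(daily):
    print(d, round(daily[d]["total_eur"], 6), daily[d]["hours"])
```

The `hours` counter plays the role of `delta_h` in the aggregated objects: it records how many hourly rows contributed to each day.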
| async def get_monthly_costs( | ||
| self, | ||
| cups: str, | ||
| pricing_rules: PricingRules, | ||
| start_date: datetime, | ||
| end_date: datetime, | ||
| is_ceuta_melilla: bool = False, | ||
| ) -> List[PricingAggregated]: | ||
| """Get monthly aggregated billing costs for a period. | ||
| Args: | ||
| cups: CUPS identifier for consumption data | ||
| pricing_rules: Pricing configuration | ||
| start_date: Start date for cost calculation | ||
| end_date: End date for cost calculation | ||
| is_ceuta_melilla: Whether to use Ceuta/Melilla PVPC prices | ||
| Returns: | ||
| List of PricingAggregated objects, one per month | ||
| """ | ||
| _LOGGER.info( | ||
| f"Getting monthly costs for CUPS {cups} from {start_date.date()} to {end_date.date()}" | ||
| ) | ||
| try: | ||
| # Generate pricing configuration hash | ||
| db_service = await self._get_db_service() | ||
| pricing_config_hash = db_service.generate_pricing_config_hash( | ||
| pricing_rules.model_dump() | ||
| ) | ||
| # Get billing data for the period | ||
| billing_records = await db_service.get_billing( | ||
| cups=cups, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| pricing_config_hash=pricing_config_hash, | ||
| ) | ||
| # If no billing data exists, calculate and store it first | ||
| if not billing_records: | ||
| _LOGGER.info("No billing data found, calculating costs first") | ||
| update_result = await self.update_missing_costs( | ||
| cups, | ||
| pricing_rules, | ||
| start_date, | ||
| end_date, | ||
| is_ceuta_melilla, | ||
| force_recalculate=False, | ||
| ) | ||
| if not update_result["success"]: | ||
| _LOGGER.error( | ||
| f"Failed to update costs: {update_result.get('error', 'Unknown error')}" | ||
| ) | ||
| return [] | ||
| # Get the newly calculated billing data | ||
| billing_records = await db_service.get_billing( | ||
| cups=cups, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| pricing_config_hash=pricing_config_hash, | ||
| ) | ||
| # Group by month and aggregate | ||
| monthly_aggregates = {} | ||
| for billing in billing_records: | ||
| # Get year-month as key | ||
| month_key = (billing.datetime.year, billing.datetime.month) | ||
| if month_key not in monthly_aggregates: | ||
| # Create datetime for first day of month | ||
| month_start = datetime(month_key[0], month_key[1], 1) | ||
| monthly_aggregates[month_key] = { | ||
| "datetime": month_start, | ||
| "total_eur": 0.0, | ||
| "energy_term": 0.0, | ||
| "power_term": 0.0, | ||
| "others_term": 0.0, | ||
| "surplus_term": 0.0, | ||
| "hours": 0, | ||
| } | ||
| # Add this hour's costs | ||
| monthly_aggregates[month_key]["total_eur"] += billing.total_eur or 0.0 | ||
| monthly_aggregates[month_key]["energy_term"] += ( | ||
| billing.energy_term or 0.0 | ||
| ) | ||
| monthly_aggregates[month_key]["power_term"] += billing.power_term or 0.0 | ||
| monthly_aggregates[month_key]["others_term"] += ( | ||
| billing.others_term or 0.0 | ||
| ) | ||
| monthly_aggregates[month_key]["surplus_term"] += ( | ||
| billing.surplus_term or 0.0 | ||
| ) | ||
| monthly_aggregates[month_key]["hours"] += 1 | ||
| # Convert to PricingAggregated objects | ||
| result = [] | ||
| for month_key in sorted(monthly_aggregates.keys()): | ||
| agg = monthly_aggregates[month_key] | ||
| pricing_agg = PricingAggregated( | ||
| datetime=agg["datetime"], | ||
| value_eur=round(agg["total_eur"], 6), | ||
| energy_term=round(agg["energy_term"], 6), | ||
| power_term=round(agg["power_term"], 6), | ||
| others_term=round(agg["others_term"], 6), | ||
| surplus_term=round(agg["surplus_term"], 6), | ||
| delta_h=agg["hours"], | ||
| ) | ||
| result.append(pricing_agg) | ||
| _LOGGER.info(f"Generated {len(result)} monthly cost aggregates") | ||
| return result | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error getting monthly costs for CUPS {cups}: {str(e)}") | ||
| raise | ||
| async def get_billing_summary( | ||
| self, | ||
| cups: str, | ||
| pricing_rules: PricingRules, | ||
| target_date: Optional[datetime] = None, | ||
| is_ceuta_melilla: bool = False, | ||
| ) -> Dict[str, Any]: | ||
| """Get billing summary data compatible with EdataHelper attributes. | ||
| Args: | ||
| cups: CUPS identifier | ||
| pricing_rules: Pricing configuration | ||
| target_date: Reference date for calculations (defaults to today) | ||
| is_ceuta_melilla: Whether to use Ceuta/Melilla PVPC prices | ||
| Returns: | ||
| Dict with summary attributes matching EdataHelper format | ||
| """ | ||
| from dateutil.relativedelta import relativedelta | ||
| if target_date is None: | ||
| target_date = datetime.now() | ||
| # Calculate date ranges | ||
| month_starts = target_date.replace( | ||
| day=1, hour=0, minute=0, second=0, microsecond=0 | ||
| ) | ||
| last_month_starts = month_starts - relativedelta(months=1) | ||
| # Initialize summary attributes | ||
| summary: Dict[str, Any] = {"month_€": None, "last_month_€": None} | ||
| try: | ||
| # Get current month cost | ||
| current_month_costs = await self.get_monthly_costs( | ||
| cups=cups, | ||
| pricing_rules=pricing_rules, | ||
| start_date=month_starts, | ||
| end_date=month_starts + relativedelta(months=1), | ||
| is_ceuta_melilla=is_ceuta_melilla, | ||
| ) | ||
| if current_month_costs: | ||
| current_month_data = next( | ||
| ( | ||
| c | ||
| for c in current_month_costs | ||
| if c.datetime.year == month_starts.year | ||
| and c.datetime.month == month_starts.month | ||
| ), | ||
| None, | ||
| ) | ||
| if current_month_data: | ||
| summary["month_€"] = current_month_data.value_eur | ||
| # Get last month cost | ||
| last_month_costs = await self.get_monthly_costs( | ||
| cups=cups, | ||
| pricing_rules=pricing_rules, | ||
| start_date=last_month_starts, | ||
| end_date=month_starts, | ||
| is_ceuta_melilla=is_ceuta_melilla, | ||
| ) | ||
| if last_month_costs: | ||
| last_month_data = next( | ||
| ( | ||
| c | ||
| for c in last_month_costs | ||
| if c.datetime.year == last_month_starts.year | ||
| and c.datetime.month == last_month_starts.month | ||
| ), | ||
| None, | ||
| ) | ||
| if last_month_data: | ||
| summary["last_month_€"] = last_month_data.value_eur | ||
| except Exception as e: | ||
| _LOGGER.warning( | ||
| f"Error calculating billing summary for CUPS {cups}: {str(e)}" | ||
| ) | ||
| # Round numeric values to 2 decimal places | ||
| for key, value in summary.items(): | ||
| if isinstance(value, float): | ||
| summary[key] = round(value, 2) | ||
| return summary |
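`get_billing_summary` derives its current-month and last-month windows by snapping the target date to the first of the month and stepping with `relativedelta`, which handles variable month lengths. The window arithmetic on its own, assuming `python-dateutil` is installed:

```python
from datetime import datetime
from dateutil.relativedelta import relativedelta

target = datetime(2024, 3, 15, 13, 45)

# Snap to the first instant of the current month
month_starts = target.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
# Step whole months in either direction; relativedelta clamps
# correctly across short months and year boundaries
last_month_starts = month_starts - relativedelta(months=1)
next_month_starts = month_starts + relativedelta(months=1)

print(last_month_starts, month_starts, next_month_starts)
```

The current-month query then spans `[month_starts, next_month_starts)` and the previous-month query `[last_month_starts, month_starts)`, so the two windows never overlap.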
| """Consumption service for fetching and updating consumption data.""" | ||
| import logging | ||
| from datetime import datetime, timedelta | ||
| from typing import Any, Dict, List, Optional | ||
| from edata.connectors.datadis import DatadisConnector | ||
| from edata.models.consumption import Consumption, ConsumptionAggregated | ||
| from edata.services.database import ConsumptionModel as DbConsumption | ||
| from edata.services.database import DatabaseService, get_database_service | ||
| from edata.utils import get_pvpc_tariff | ||
| _LOGGER = logging.getLogger(__name__) | ||
| class ConsumptionService: | ||
| """Service for managing consumption data fetching and storage.""" | ||
| def __init__( | ||
| self, | ||
| datadis_connector: DatadisConnector, | ||
| storage_dir: Optional[str] = None, | ||
| ): | ||
| """Initialize consumption service. | ||
| Args: | ||
| datadis_connector: Configured Datadis connector instance | ||
| storage_dir: Directory for database and cache storage | ||
| """ | ||
| self._datadis = datadis_connector | ||
| self._storage_dir = storage_dir | ||
| self._db_service = None | ||
| async def _get_db_service(self) -> DatabaseService: | ||
| """Get database service, initializing if needed.""" | ||
| if self._db_service is None: | ||
| self._db_service = await get_database_service(self._storage_dir) | ||
| return self._db_service | ||
| async def update_consumptions( | ||
| self, | ||
| cups: str, | ||
| distributor_code: str, | ||
| start_date: datetime, | ||
| end_date: datetime, | ||
| measurement_type: str = "0", | ||
| point_type: int = 5, | ||
| authorized_nif: Optional[str] = None, | ||
| force_full_update: bool = False, | ||
| ) -> Dict[str, Any]: | ||
| """Update consumption data for a CUPS in the specified date range. | ||
| Args: | ||
| cups: CUPS identifier | ||
| distributor_code: Distributor company code | ||
| start_date: Start date for consumption data | ||
| end_date: End date for consumption data | ||
| measurement_type: Type of measurement (default "0" for hourly) | ||
| point_type: Type of supply point (default 5) | ||
| authorized_nif: Authorized NIF if accessing on behalf of someone | ||
| force_full_update: If True, fetch all data ignoring existing records | ||
| Returns: | ||
| Dict with operation results and statistics | ||
| """ | ||
| _LOGGER.info( | ||
| f"Updating consumptions for CUPS {cups[-5:]:>5} from {start_date.date()} to {end_date.date()}" | ||
| ) | ||
| # Determine actual start date based on existing data | ||
| actual_start_date = start_date | ||
| if not force_full_update: | ||
| last_consumption_date = await self.get_last_consumption_date(cups) | ||
| if last_consumption_date: | ||
| # Start from the day after the last consumption | ||
| actual_start_date = max( | ||
| start_date, last_consumption_date + timedelta(hours=1) | ||
| ) | ||
| _LOGGER.info( | ||
| f"Found existing data up to {last_consumption_date.date()}, fetching from {actual_start_date.date()}" | ||
| ) | ||
| else: | ||
| _LOGGER.info( | ||
| f"No existing consumption data found for CUPS {cups[-5:]:>5}, fetching all data" | ||
| ) | ||
| # If actual start date is beyond end date, no new data needed | ||
| if actual_start_date >= end_date: | ||
| _LOGGER.info( | ||
| f"No new consumption data needed for CUPS {cups[-5:]:>5} (up to date)" | ||
| ) | ||
| return { | ||
| "success": True, | ||
| "cups": cups, | ||
| "period": { | ||
| "start": start_date.isoformat(), | ||
| "end": end_date.isoformat(), | ||
| "actual_start": actual_start_date.isoformat(), | ||
| }, | ||
| "stats": { | ||
| "fetched": 0, | ||
| "saved": 0, | ||
| "updated": 0, | ||
| "skipped": "up_to_date", | ||
| }, | ||
| "message": "Data is up to date", | ||
| } | ||
| try: | ||
| # Fetch consumption data from datadis (only missing data) | ||
| consumptions = await self._datadis.get_consumption_data( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=actual_start_date, | ||
| end_date=end_date, | ||
| measurement_type=measurement_type, | ||
| point_type=point_type, | ||
| authorized_nif=authorized_nif, | ||
| ) | ||
| # Save to database | ||
| saved_count = 0 | ||
| updated_count = 0 | ||
| for consumption in consumptions: | ||
| # Convert Pydantic model to dict and add CUPS | ||
| consumption_dict = consumption.model_dump() | ||
| consumption_dict["cups"] = cups | ||
| # Check if consumption already exists | ||
| db_service = await self._get_db_service() | ||
| existing = await db_service.get_consumptions( | ||
| cups=cups, | ||
| start_date=consumption.datetime, | ||
| end_date=consumption.datetime, | ||
| ) | ||
| if existing: | ||
| updated_count += 1 | ||
| else: | ||
| saved_count += 1 | ||
| await db_service.save_consumption(consumption_dict) | ||
| result = { | ||
| "success": True, | ||
| "cups": cups, | ||
| "period": { | ||
| "start": start_date.isoformat(), | ||
| "end": end_date.isoformat(), | ||
| "actual_start": actual_start_date.isoformat(), | ||
| }, | ||
| "stats": { | ||
| "fetched": len(consumptions), | ||
| "saved": saved_count, | ||
| "updated": updated_count, | ||
| }, | ||
| } | ||
| if actual_start_date > start_date: | ||
| result["message"] = ( | ||
| f"Fetched only missing data from {actual_start_date.date()}" | ||
| ) | ||
| _LOGGER.info( | ||
| f"Consumption update completed: {len(consumptions)} fetched, " | ||
| f"{saved_count} saved, {updated_count} updated" | ||
| ) | ||
| return result | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error updating consumptions for CUPS {cups}: {str(e)}") | ||
| return { | ||
| "success": False, | ||
| "cups": cups, | ||
| "error": str(e), | ||
| "period": { | ||
| "start": start_date.isoformat(), | ||
| "end": end_date.isoformat(), | ||
| "actual_start": ( | ||
| actual_start_date.isoformat() | ||
| if "actual_start_date" in locals() | ||
| else start_date.isoformat() | ||
| ), | ||
| }, | ||
| } | ||
| async def update_consumption_range_by_months( | ||
| self, | ||
| cups: str, | ||
| distributor_code: str, | ||
| start_date: datetime, | ||
| end_date: datetime, | ||
| measurement_type: str = "0", | ||
| point_type: int = 5, | ||
| authorized_nif: Optional[str] = None, | ||
| force_full_update: bool = False, | ||
| ) -> Dict[str, Any]: | ||
| """Update consumption data month by month to respect datadis limits. | ||
| Args: | ||
| cups: CUPS identifier | ||
| distributor_code: Distributor company code | ||
| start_date: Start date for consumption data | ||
| end_date: End date for consumption data | ||
| measurement_type: Type of measurement (default "0" for hourly) | ||
| point_type: Type of supply point (default 5) | ||
| authorized_nif: Authorized NIF if accessing on behalf of someone | ||
| force_full_update: If True, fetch all data ignoring existing records | ||
| Returns: | ||
| Dict with operation results and statistics for all months | ||
| """ | ||
| _LOGGER.info( | ||
| f"Updating consumption range for CUPS {cups[-5:]:>5} " | ||
| f"from {start_date.date()} to {end_date.date()} by months" | ||
| ) | ||
| results = [] | ||
| current_date = start_date | ||
| while current_date < end_date: | ||
| # Calculate month end | ||
| if current_date.month == 12: | ||
| month_end = current_date.replace( | ||
| year=current_date.year + 1, month=1, day=1 | ||
| ) | ||
| else: | ||
| month_end = current_date.replace(month=current_date.month + 1, day=1) | ||
| # Don't go past the requested end date | ||
| actual_end = min(month_end, end_date) | ||
| # Update consumptions for this month | ||
| consumption_result = await self.update_consumptions( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=current_date, | ||
| end_date=actual_end, | ||
| measurement_type=measurement_type, | ||
| point_type=point_type, | ||
| authorized_nif=authorized_nif, | ||
| force_full_update=force_full_update, | ||
| ) | ||
| result_entry = { | ||
| "month": current_date.strftime("%Y-%m"), | ||
| "consumption": consumption_result, | ||
| } | ||
| results.append(result_entry) | ||
| current_date = month_end | ||
| # Calculate totals | ||
| total_consumptions_fetched = sum( | ||
| r["consumption"]["stats"]["fetched"] | ||
| for r in results | ||
| if r["consumption"]["success"] | ||
| ) | ||
| total_consumptions_saved = sum( | ||
| r["consumption"]["stats"]["saved"] | ||
| for r in results | ||
| if r["consumption"]["success"] | ||
| ) | ||
| total_consumptions_updated = sum( | ||
| r["consumption"]["stats"]["updated"] | ||
| for r in results | ||
| if r["consumption"]["success"] | ||
| ) | ||
| summary = { | ||
| "success": all(r["consumption"]["success"] for r in results), | ||
| "cups": cups, | ||
| "period": {"start": start_date.isoformat(), "end": end_date.isoformat()}, | ||
| "months_processed": len(results), | ||
| "total_stats": { | ||
| "consumptions_fetched": total_consumptions_fetched, | ||
| "consumptions_saved": total_consumptions_saved, | ||
| "consumptions_updated": total_consumptions_updated, | ||
| }, | ||
| "monthly_results": results, | ||
| } | ||
| _LOGGER.info( | ||
| f"Consumption range update completed: {len(results)} months processed, " | ||
| f"{total_consumptions_fetched} consumptions fetched" | ||
| ) | ||
| return summary | ||
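The month stepping in `update_consumption_range_by_months` rolls each window forward to the first of the next month and clamps the final window to the requested end date. That boundary logic can be exercised standalone:

```python
from datetime import datetime

def month_windows(start: datetime, end: datetime):
    """Yield (window_start, window_end) pairs, one per calendar month."""
    current = start
    while current < end:
        # Roll to the first day of the following month,
        # handling the December -> January year rollover
        if current.month == 12:
            month_end = current.replace(year=current.year + 1, month=1, day=1)
        else:
            month_end = current.replace(month=current.month + 1, day=1)
        # Never step past the requested end date
        yield current, min(month_end, end)
        current = month_end

windows = list(month_windows(datetime(2023, 11, 15), datetime(2024, 1, 10)))
for w_start, w_end in windows:
    print(w_start.date(), "->", w_end.date())
```

A range spanning a year boundary produces three windows here, with only the last one clamped, which is the behavior the service relies on to keep each Datadis request within one month.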
| async def get_stored_consumptions( | ||
| self, | ||
| cups: str, | ||
| start_date: Optional[datetime] = None, | ||
| end_date: Optional[datetime] = None, | ||
| ) -> List[DbConsumption]: | ||
| """Get stored consumptions from database. | ||
| Args: | ||
| cups: CUPS identifier | ||
| start_date: Optional start date filter | ||
| end_date: Optional end date filter | ||
| Returns: | ||
| List of database Consumption objects | ||
| """ | ||
| db_service = await self._get_db_service() | ||
| return await db_service.get_consumptions(cups, start_date, end_date) | ||
| async def get_last_consumption_date(self, cups: str) -> Optional[datetime]: | ||
| """Get the date of the last consumption record in the database. | ||
| Args: | ||
| cups: CUPS identifier | ||
| Returns: | ||
| Datetime of last consumption or None if no data exists | ||
| """ | ||
| db_service = await self._get_db_service() | ||
| latest_consumption = await db_service.get_latest_consumption(cups) | ||
| if latest_consumption: | ||
| return latest_consumption.datetime | ||
| return None | ||
| async def get_daily_consumptions( | ||
| self, cups: str, start_date: datetime, end_date: datetime | ||
| ) -> List[ConsumptionAggregated]: | ||
| """Calculate daily consumption aggregations. | ||
| Args: | ||
| cups: CUPS identifier | ||
| start_date: Start date for aggregation | ||
| end_date: End date for aggregation | ||
| Returns: | ||
| List of daily consumption aggregations | ||
| """ | ||
| # Get hourly consumptions from database | ||
| db_service = await self._get_db_service() | ||
| db_consumptions = await db_service.get_consumptions(cups, start_date, end_date) | ||
| # Convert to Pydantic models for processing | ||
| consumptions = [] | ||
| for db_cons in db_consumptions: | ||
| cons = Consumption( | ||
| datetime=db_cons.datetime, | ||
| delta_h=db_cons.delta_h, | ||
| value_kwh=db_cons.value_kwh, | ||
| surplus_kwh=db_cons.surplus_kwh or 0.0, | ||
| real=db_cons.real if db_cons.real is not None else True,  # "or True" would coerce False to True | ||
| ) | ||
| consumptions.append(cons) | ||
| # Sort by datetime | ||
| consumptions.sort(key=lambda x: x.datetime) | ||
| # Aggregate by day | ||
| daily_aggregations = {} | ||
| for consumption in consumptions: | ||
| curr_day = consumption.datetime.replace( | ||
| hour=0, minute=0, second=0, microsecond=0 | ||
| ) | ||
| # Determine tariff period | ||
| tariff = get_pvpc_tariff(consumption.datetime) | ||
| # Initialize daily aggregation if not exists | ||
| if curr_day not in daily_aggregations: | ||
| daily_aggregations[curr_day] = { | ||
| "datetime": curr_day, | ||
| "value_kwh": 0.0, | ||
| "value_p1_kwh": 0.0, | ||
| "value_p2_kwh": 0.0, | ||
| "value_p3_kwh": 0.0, | ||
| "surplus_kwh": 0.0, | ||
| "surplus_p1_kwh": 0.0, | ||
| "surplus_p2_kwh": 0.0, | ||
| "surplus_p3_kwh": 0.0, | ||
| "delta_h": 0.0, | ||
| } | ||
| # Add consumption values | ||
| daily_aggregations[curr_day]["value_kwh"] += consumption.value_kwh | ||
| daily_aggregations[curr_day]["surplus_kwh"] += consumption.surplus_kwh | ||
| daily_aggregations[curr_day]["delta_h"] += consumption.delta_h | ||
| # Add by tariff period | ||
| if tariff == "p1": | ||
| daily_aggregations[curr_day]["value_p1_kwh"] += consumption.value_kwh | ||
| daily_aggregations[curr_day][ | ||
| "surplus_p1_kwh" | ||
| ] += consumption.surplus_kwh | ||
| elif tariff == "p2": | ||
| daily_aggregations[curr_day]["value_p2_kwh"] += consumption.value_kwh | ||
| daily_aggregations[curr_day][ | ||
| "surplus_p2_kwh" | ||
| ] += consumption.surplus_kwh | ||
| elif tariff == "p3": | ||
| daily_aggregations[curr_day]["value_p3_kwh"] += consumption.value_kwh | ||
| daily_aggregations[curr_day][ | ||
| "surplus_p3_kwh" | ||
| ] += consumption.surplus_kwh | ||
| # Convert to ConsumptionAggregated objects and round values | ||
| result = [] | ||
| for day_data in sorted( | ||
| daily_aggregations.values(), key=lambda x: x["datetime"] | ||
| ): | ||
| # Round all float values to 2 decimal places | ||
| for key, value in day_data.items(): | ||
| if isinstance(value, float): | ||
| day_data[key] = round(value, 2) | ||
| aggregated = ConsumptionAggregated(**day_data) | ||
| result.append(aggregated) | ||
| return result | ||
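Bucketing each hour's energy into the p1/p2/p3 columns above depends on `get_pvpc_tariff`. A simplified stand-in shows the shape of that aggregation; note the real 2.0TD period rules also account for national holidays, so this toy classifier (weekends and nights as p3, peak bands as p1) is illustrative only:

```python
from datetime import datetime

def toy_pvpc_tariff(ts: datetime) -> str:
    # Simplified 2.0TD-style periods, ignoring holidays:
    # weekends and 00:00-08:00 are valley (p3), the 10-14 and
    # 18-22 bands are peak (p1), everything else is flat (p2)
    if ts.weekday() >= 5 or ts.hour < 8:
        return "p3"
    if 10 <= ts.hour < 14 or 18 <= ts.hour < 22:
        return "p1"
    return "p2"

# One kWh for every hour of a Monday (2024-05-06)
totals = {"p1": 0.0, "p2": 0.0, "p3": 0.0}
readings = [(datetime(2024, 5, 6, h), 1.0) for h in range(24)]
for ts, kwh in readings:
    totals[toy_pvpc_tariff(ts)] += kwh
print(totals)  # → {'p1': 8.0, 'p2': 8.0, 'p3': 8.0}
```

On a weekday each period covers eight hours, which is why a flat consumption profile splits evenly across the three columns.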
| async def get_monthly_consumptions( | ||
| self, | ||
| cups: str, | ||
| start_date: datetime, | ||
| end_date: datetime, | ||
| cycle_start_day: int = 1, | ||
| ) -> List[ConsumptionAggregated]: | ||
| """Calculate monthly consumption aggregations. | ||
| Args: | ||
| cups: CUPS identifier | ||
| start_date: Start date for aggregation | ||
| end_date: End date for aggregation | ||
| cycle_start_day: Day of month when billing cycle starts (1-30) | ||
| Returns: | ||
| List of monthly consumption aggregations | ||
| """ | ||
| # Get hourly consumptions from database | ||
| db_service = await self._get_db_service() | ||
| db_consumptions = await db_service.get_consumptions(cups, start_date, end_date) | ||
| # Convert to Pydantic models for processing | ||
| consumptions = [] | ||
| for db_cons in db_consumptions: | ||
| cons = Consumption( | ||
| datetime=db_cons.datetime, | ||
| delta_h=db_cons.delta_h, | ||
| value_kwh=db_cons.value_kwh, | ||
| surplus_kwh=db_cons.surplus_kwh or 0.0, | ||
| real=db_cons.real if db_cons.real is not None else True,  # "or True" would coerce False to True | ||
| ) | ||
| consumptions.append(cons) | ||
| # Sort by datetime | ||
| consumptions.sort(key=lambda x: x.datetime) | ||
| # Calculate cycle offset | ||
| cycle_offset = cycle_start_day - 1 | ||
| # Aggregate by month (considering billing cycle) | ||
| monthly_aggregations = {} | ||
| for consumption in consumptions: | ||
| curr_day = consumption.datetime.replace( | ||
| hour=0, minute=0, second=0, microsecond=0 | ||
| ) | ||
| # Adjust for billing cycle start day | ||
| billing_month_date = (curr_day - timedelta(days=cycle_offset)).replace( | ||
| day=1 | ||
| ) | ||
| # Determine tariff period | ||
| tariff = get_pvpc_tariff(consumption.datetime) | ||
| # Initialize monthly aggregation if not exists | ||
| if billing_month_date not in monthly_aggregations: | ||
| monthly_aggregations[billing_month_date] = { | ||
| "datetime": billing_month_date, | ||
| "value_kwh": 0.0, | ||
| "value_p1_kwh": 0.0, | ||
| "value_p2_kwh": 0.0, | ||
| "value_p3_kwh": 0.0, | ||
| "surplus_kwh": 0.0, | ||
| "surplus_p1_kwh": 0.0, | ||
| "surplus_p2_kwh": 0.0, | ||
| "surplus_p3_kwh": 0.0, | ||
| "delta_h": 0.0, | ||
| } | ||
| # Add consumption values | ||
| monthly_aggregations[billing_month_date][ | ||
| "value_kwh" | ||
| ] += consumption.value_kwh | ||
| monthly_aggregations[billing_month_date][ | ||
| "surplus_kwh" | ||
| ] += consumption.surplus_kwh | ||
| monthly_aggregations[billing_month_date]["delta_h"] += consumption.delta_h | ||
| # Add by tariff period | ||
| if tariff == "p1": | ||
| monthly_aggregations[billing_month_date][ | ||
| "value_p1_kwh" | ||
| ] += consumption.value_kwh | ||
| monthly_aggregations[billing_month_date][ | ||
| "surplus_p1_kwh" | ||
| ] += consumption.surplus_kwh | ||
| elif tariff == "p2": | ||
| monthly_aggregations[billing_month_date][ | ||
| "value_p2_kwh" | ||
| ] += consumption.value_kwh | ||
| monthly_aggregations[billing_month_date][ | ||
| "surplus_p2_kwh" | ||
| ] += consumption.surplus_kwh | ||
| elif tariff == "p3": | ||
| monthly_aggregations[billing_month_date][ | ||
| "value_p3_kwh" | ||
| ] += consumption.value_kwh | ||
| monthly_aggregations[billing_month_date][ | ||
| "surplus_p3_kwh" | ||
| ] += consumption.surplus_kwh | ||
| # Convert to ConsumptionAggregated objects and round values | ||
| result = [] | ||
| for month_data in sorted( | ||
| monthly_aggregations.values(), key=lambda x: x["datetime"] | ||
| ): | ||
| # Round all float values to 2 decimal places | ||
| for key, value in month_data.items(): | ||
| if isinstance(value, float): | ||
| month_data[key] = round(value, 2) | ||
| aggregated = ConsumptionAggregated(**month_data) | ||
| result.append(aggregated) | ||
| return result | ||
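The cycle adjustment in `get_monthly_consumptions` shifts each day back by `cycle_start_day - 1` days before snapping to the 1st, so a reading on, say, the 3rd with a cycle starting on the 5th still lands in the previous billing month. The same shift as a standalone helper:

```python
from datetime import datetime, timedelta

def billing_month(day: datetime, cycle_start_day: int) -> datetime:
    """Map a calendar day to the first day of its billing month."""
    offset = cycle_start_day - 1
    # Shift back by the cycle offset, then snap to the 1st
    return (day - timedelta(days=offset)).replace(day=1)

# Cycle starts on the 5th: May 3rd belongs to April's cycle,
# May 6th to May's.
print(billing_month(datetime(2024, 5, 3), 5).date())  # → 2024-04-01
print(billing_month(datetime(2024, 5, 6), 5).date())  # → 2024-05-01
```

With `cycle_start_day=1` the offset is zero and the function degrades to plain calendar-month grouping, matching the default behavior.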
| async def get_consumption_summary( | ||
| self, cups: str, target_date: Optional[datetime] = None | ||
| ) -> Dict[str, Any]: | ||
| """Get consumption summary data compatible with EdataHelper attributes. | ||
| Args: | ||
| cups: CUPS identifier | ||
| target_date: Reference date for calculations (defaults to today) | ||
| Returns: | ||
| Dict with summary attributes matching EdataHelper format | ||
| """ | ||
| from dateutil.relativedelta import relativedelta | ||
| if target_date is None: | ||
| target_date = datetime.now() | ||
| # Calculate date ranges | ||
| today_starts = target_date.replace(hour=0, minute=0, second=0, microsecond=0) | ||
| yesterday_starts = today_starts - timedelta(days=1) | ||
| month_starts = target_date.replace( | ||
| day=1, hour=0, minute=0, second=0, microsecond=0 | ||
| ) | ||
| last_month_starts = month_starts - relativedelta(months=1) | ||
| # Get daily and monthly aggregations | ||
| daily_consumptions = await self.get_daily_consumptions( | ||
| cups=cups, start_date=yesterday_starts, end_date=today_starts | ||
| ) | ||
| monthly_consumptions = await self.get_monthly_consumptions( | ||
| cups=cups, | ||
| start_date=last_month_starts, | ||
| end_date=month_starts + relativedelta(months=1), | ||
| ) | ||
| # Get all consumptions to find last registered data | ||
| all_consumptions = await self.get_stored_consumptions(cups=cups) | ||
| # Initialize summary attributes | ||
| summary: Dict[str, Any] = { | ||
| # Yesterday consumption | ||
| "yesterday_kWh": None, | ||
| "yesterday_hours": None, | ||
| "yesterday_p1_kWh": None, | ||
| "yesterday_p2_kWh": None, | ||
| "yesterday_p3_kWh": None, | ||
| "yesterday_surplus_kWh": None, | ||
| "yesterday_surplus_p1_kWh": None, | ||
| "yesterday_surplus_p2_kWh": None, | ||
| "yesterday_surplus_p3_kWh": None, | ||
| # Current month consumption | ||
| "month_kWh": None, | ||
| "month_surplus_kWh": None, | ||
| "month_days": None, | ||
| "month_daily_kWh": None, | ||
| "month_p1_kWh": None, | ||
| "month_p2_kWh": None, | ||
| "month_p3_kWh": None, | ||
| "month_surplus_p1_kWh": None, | ||
| "month_surplus_p2_kWh": None, | ||
| "month_surplus_p3_kWh": None, | ||
| # Last month consumption | ||
| "last_month_kWh": None, | ||
| "last_month_surplus_kWh": None, | ||
| "last_month_days": None, | ||
| "last_month_daily_kWh": None, | ||
| "last_month_p1_kWh": None, | ||
| "last_month_p2_kWh": None, | ||
| "last_month_p3_kWh": None, | ||
| "last_month_surplus_p1_kWh": None, | ||
| "last_month_surplus_p2_kWh": None, | ||
| "last_month_surplus_p3_kWh": None, | ||
| # Last registered data | ||
| "last_registered_date": None, | ||
| "last_registered_day_kWh": None, | ||
| "last_registered_day_surplus_kWh": None, | ||
| "last_registered_day_hours": None, | ||
| "last_registered_day_p1_kWh": None, | ||
| "last_registered_day_p2_kWh": None, | ||
| "last_registered_day_p3_kWh": None, | ||
| "last_registered_day_surplus_p1_kWh": None, | ||
| "last_registered_day_surplus_p2_kWh": None, | ||
| "last_registered_day_surplus_p3_kWh": None, | ||
| } | ||
| # Fill yesterday data | ||
| yesterday_data = next( | ||
| ( | ||
| d | ||
| for d in daily_consumptions | ||
| if d.datetime.date() == yesterday_starts.date() | ||
| ), | ||
| None, | ||
| ) | ||
| if yesterday_data: | ||
| summary["yesterday_kWh"] = yesterday_data.value_kwh | ||
| summary["yesterday_hours"] = yesterday_data.delta_h | ||
| summary["yesterday_p1_kWh"] = yesterday_data.value_p1_kwh | ||
| summary["yesterday_p2_kWh"] = yesterday_data.value_p2_kwh | ||
| summary["yesterday_p3_kWh"] = yesterday_data.value_p3_kwh | ||
| summary["yesterday_surplus_kWh"] = yesterday_data.surplus_kwh | ||
| summary["yesterday_surplus_p1_kWh"] = yesterday_data.surplus_p1_kwh | ||
| summary["yesterday_surplus_p2_kWh"] = yesterday_data.surplus_p2_kwh | ||
| summary["yesterday_surplus_p3_kWh"] = yesterday_data.surplus_p3_kwh | ||
| # Fill current month data | ||
| current_month_data = next( | ||
| ( | ||
| m | ||
| for m in monthly_consumptions | ||
| if m.datetime.year == month_starts.year | ||
| and m.datetime.month == month_starts.month | ||
| ), | ||
| None, | ||
| ) | ||
| if current_month_data: | ||
| summary["month_kWh"] = current_month_data.value_kwh | ||
| summary["month_surplus_kWh"] = current_month_data.surplus_kwh | ||
| summary["month_days"] = ( | ||
| current_month_data.delta_h / 24 if current_month_data.delta_h else None | ||
| ) | ||
| summary["month_daily_kWh"] = ( | ||
| (current_month_data.value_kwh / (current_month_data.delta_h / 24)) | ||
| if current_month_data.delta_h and current_month_data.delta_h > 0 | ||
| else None | ||
| ) | ||
| summary["month_p1_kWh"] = current_month_data.value_p1_kwh | ||
| summary["month_p2_kWh"] = current_month_data.value_p2_kwh | ||
| summary["month_p3_kWh"] = current_month_data.value_p3_kwh | ||
| summary["month_surplus_p1_kWh"] = current_month_data.surplus_p1_kwh | ||
| summary["month_surplus_p2_kWh"] = current_month_data.surplus_p2_kwh | ||
| summary["month_surplus_p3_kWh"] = current_month_data.surplus_p3_kwh | ||
| # Fill last month data | ||
| last_month_data = next( | ||
| ( | ||
| m | ||
| for m in monthly_consumptions | ||
| if m.datetime.year == last_month_starts.year | ||
| and m.datetime.month == last_month_starts.month | ||
| ), | ||
| None, | ||
| ) | ||
| if last_month_data: | ||
| summary["last_month_kWh"] = last_month_data.value_kwh | ||
| summary["last_month_surplus_kWh"] = last_month_data.surplus_kwh | ||
| summary["last_month_days"] = ( | ||
| last_month_data.delta_h / 24 if last_month_data.delta_h else None | ||
| ) | ||
| summary["last_month_daily_kWh"] = ( | ||
| (last_month_data.value_kwh / (last_month_data.delta_h / 24)) | ||
| if last_month_data.delta_h and last_month_data.delta_h > 0 | ||
| else None | ||
| ) | ||
| summary["last_month_p1_kWh"] = last_month_data.value_p1_kwh | ||
| summary["last_month_p2_kWh"] = last_month_data.value_p2_kwh | ||
| summary["last_month_p3_kWh"] = last_month_data.value_p3_kwh | ||
| summary["last_month_surplus_p1_kWh"] = last_month_data.surplus_p1_kwh | ||
| summary["last_month_surplus_p2_kWh"] = last_month_data.surplus_p2_kwh | ||
| summary["last_month_surplus_p3_kWh"] = last_month_data.surplus_p3_kwh | ||
| # Fill last registered data | ||
| if all_consumptions: | ||
| # Find the consumption record with the latest datetime | ||
| last_consumption = max(all_consumptions, key=lambda c: c.datetime) | ||
| summary["last_registered_date"] = last_consumption.datetime | ||
| # Get the last day's aggregated data | ||
| last_day_start = last_consumption.datetime.replace( | ||
| hour=0, minute=0, second=0, microsecond=0 | ||
| ) | ||
| last_day_end = last_day_start + timedelta(days=1) | ||
| last_day_daily = await self.get_daily_consumptions( | ||
| cups=cups, start_date=last_day_start, end_date=last_day_end | ||
| ) | ||
| if last_day_daily: | ||
| last_day_data = last_day_daily[0] | ||
| summary["last_registered_day_kWh"] = last_day_data.value_kwh | ||
| summary["last_registered_day_surplus_kWh"] = last_day_data.surplus_kwh | ||
| summary["last_registered_day_hours"] = last_day_data.delta_h | ||
| summary["last_registered_day_p1_kWh"] = last_day_data.value_p1_kwh | ||
| summary["last_registered_day_p2_kWh"] = last_day_data.value_p2_kwh | ||
| summary["last_registered_day_p3_kWh"] = last_day_data.value_p3_kwh | ||
| summary["last_registered_day_surplus_p1_kWh"] = ( | ||
| last_day_data.surplus_p1_kwh | ||
| ) | ||
| summary["last_registered_day_surplus_p2_kWh"] = ( | ||
| last_day_data.surplus_p2_kwh | ||
| ) | ||
| summary["last_registered_day_surplus_p3_kWh"] = ( | ||
| last_day_data.surplus_p3_kwh | ||
| ) | ||
| # Round numeric values to 2 decimal places | ||
| for key, value in summary.items(): | ||
| if isinstance(value, float): | ||
| summary[key] = round(value, 2) | ||
| return summary |
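Both this method and the monthly aggregation finish with the same pass: round every float in the result to two decimals while leaving `None` and integer values untouched. A standalone sketch of that pattern (the field names are illustrative):

```python
def round_floats(data: dict, ndigits: int = 2) -> dict:
    """Round float values only; pass None, ints and other types through."""
    return {
        key: round(value, ndigits) if isinstance(value, float) else value
        for key, value in data.items()
    }

summary = {"yesterday_kWh": 12.3456, "yesterday_hours": 24, "last_registered_date": None}
print(round_floats(summary))
```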
| """Contract service for fetching and managing contract data.""" | ||
| import logging | ||
| from datetime import datetime | ||
| from typing import Any, Dict, List, Optional | ||
| from edata.connectors.datadis import DatadisConnector | ||
| from edata.services.database import ContractModel, DatabaseService, get_database_service | ||
| _LOGGER = logging.getLogger(__name__) | ||
| class ContractService: | ||
| """Service for managing contract data fetching and storage.""" | ||
| def __init__( | ||
| self, | ||
| datadis_connector: DatadisConnector, | ||
| storage_dir: Optional[str] = None, | ||
| ): | ||
| """Initialize contract service. | ||
| Args: | ||
| datadis_connector: Configured Datadis connector instance | ||
| storage_dir: Directory for database and cache storage | ||
| """ | ||
| self._datadis = datadis_connector | ||
| self._storage_dir = storage_dir | ||
| self._db_service = None | ||
| async def _get_db_service(self) -> DatabaseService: | ||
| """Get database service, initializing if needed.""" | ||
| if self._db_service is None: | ||
| self._db_service = await get_database_service(self._storage_dir) | ||
| return self._db_service | ||
| async def update_contracts( | ||
| self, cups: str, distributor_code: str, authorized_nif: Optional[str] = None | ||
| ) -> Dict[str, Any]: | ||
| """Update contract data for a CUPS. | ||
| Args: | ||
| cups: CUPS identifier | ||
| distributor_code: Distributor code for the CUPS | ||
| authorized_nif: Optional authorized NIF for access | ||
| Returns: | ||
| Dict with operation results and statistics | ||
| """ | ||
| _LOGGER.info(f"Updating contracts for CUPS {cups[-5:]}") | ||
| try: | ||
| # Fetch contract data from Datadis | ||
| contracts_data = await self._datadis.get_contract_detail( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| authorized_nif=authorized_nif, | ||
| ) | ||
| if not contracts_data: | ||
| _LOGGER.warning(f"No contract data found for CUPS {cups[-5:]}") | ||
| return { | ||
| "success": True, | ||
| "stats": { | ||
| "fetched": 0, | ||
| "saved": 0, | ||
| "updated": 0, | ||
| "total_stored": 0, | ||
| }, | ||
| } | ||
| # Get existing contracts to classify fetched ones as new or updated; | ||
| # the database upserts by (cups, date_start), so key on date_start | ||
| # to keep these stats consistent with what is actually stored | ||
| db_service = await self._get_db_service() | ||
| existing = await db_service.get_contracts(cups=cups) | ||
| existing_starts = {c.date_start for c in existing} | ||
| # Save contracts to database | ||
| saved_count = 0 | ||
| updated_count = 0 | ||
| for contract in contracts_data: | ||
| contract_dict = contract.model_dump() | ||
| contract_dict["cups"] = cups | ||
| # Check whether this contract period already exists | ||
| if contract.date_start in existing_starts: | ||
| updated_count += 1 | ||
| _LOGGER.debug( | ||
| f"Updating existing contract for CUPS {cups[-5:]} " | ||
| f"period {contract.date_start.date()}-{contract.date_end.date()}" | ||
| ) | ||
| else: | ||
| saved_count += 1 | ||
| _LOGGER.debug( | ||
| f"Saving new contract for CUPS {cups[-5:]} " | ||
| f"period {contract.date_start.date()}-{contract.date_end.date()}" | ||
| ) | ||
| # Save to database | ||
| await db_service.save_contract(contract_dict) | ||
| # Get total contracts stored for this CUPS | ||
| all_contracts = await db_service.get_contracts(cups=cups) | ||
| total_stored = len(all_contracts) | ||
| result = { | ||
| "success": True, | ||
| "stats": { | ||
| "fetched": len(contracts_data), | ||
| "saved": saved_count, | ||
| "updated": updated_count, | ||
| "total_stored": total_stored, | ||
| }, | ||
| } | ||
| _LOGGER.info( | ||
| f"Contract update completed for CUPS {cups[-5:]}: " | ||
| f"{len(contracts_data)} fetched, {saved_count} saved, {updated_count} updated" | ||
| ) | ||
| return result | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error updating contracts for CUPS {cups[-5:]}: {str(e)}") | ||
| return { | ||
| "success": False, | ||
| "error": str(e), | ||
| "stats": {"fetched": 0, "saved": 0, "updated": 0, "total_stored": 0}, | ||
| } | ||
| async def get_contracts( | ||
| self, | ||
| cups: str, | ||
| start_date: Optional[datetime] = None, | ||
| end_date: Optional[datetime] = None, | ||
| ) -> List[ContractModel]: | ||
| """Get stored contract data for a CUPS. | ||
| Args: | ||
| cups: CUPS identifier | ||
| start_date: Optional start date filter | ||
| end_date: Optional end date filter | ||
| Returns: | ||
| List of Contract objects | ||
| """ | ||
| _LOGGER.debug( | ||
| f"Getting contracts for CUPS {cups[-5:]}" | ||
| f"{f' from {start_date.date()}' if start_date else ''}" | ||
| f"{f' to {end_date.date()}' if end_date else ''}" | ||
| ) | ||
| try: | ||
| db_service = await self._get_db_service() | ||
| contracts = await db_service.get_contracts(cups=cups) | ||
| # Apply date filters if provided | ||
| if start_date or end_date: | ||
| filtered_contracts = [] | ||
| for contract in contracts: | ||
| # Check if contract period overlaps with requested period | ||
| if start_date and contract.date_end < start_date: | ||
| continue | ||
| if end_date and contract.date_start > end_date: | ||
| continue | ||
| filtered_contracts.append(contract) | ||
| contracts = filtered_contracts | ||
| _LOGGER.debug(f"Found {len(contracts)} contracts for CUPS {cups[-5:]}") | ||
| return contracts | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error getting contracts for CUPS {cups[-5:]}: {str(e)}") | ||
| return [] | ||
| async def get_active_contract( | ||
| self, cups: str, reference_date: Optional[datetime] = None | ||
| ) -> Optional[ContractModel]: | ||
| """Get the active contract for a CUPS at a specific date. | ||
| Args: | ||
| cups: CUPS identifier | ||
| reference_date: Date to check for active contract (defaults to now) | ||
| Returns: | ||
| Active contract if found, None otherwise | ||
| """ | ||
| if reference_date is None: | ||
| reference_date = datetime.now() | ||
| _LOGGER.debug( | ||
| f"Getting active contract for CUPS {cups[-5:]} at {reference_date.date()}" | ||
| ) | ||
| try: | ||
| contracts = await self.get_contracts(cups=cups) | ||
| for contract in contracts: | ||
| if contract.date_start <= reference_date <= contract.date_end: | ||
| _LOGGER.debug( | ||
| f"Found active contract for CUPS {cups[-5:]} " | ||
| f"period {contract.date_start.date()}-{contract.date_end.date()}" | ||
| ) | ||
| return contract | ||
| _LOGGER.warning( | ||
| f"No active contract found for CUPS {cups[-5:]} at {reference_date.date()}" | ||
| ) | ||
| return None | ||
| except Exception as e: | ||
| _LOGGER.error( | ||
| f"Error getting active contract for CUPS {cups[-5:]}: {str(e)}" | ||
| ) | ||
| return None | ||
| async def get_latest_contract(self, cups: str) -> Optional[ContractModel]: | ||
| """Get the most recent contract for a CUPS. | ||
| Args: | ||
| cups: CUPS identifier | ||
| Returns: | ||
| Latest contract if found, None otherwise | ||
| """ | ||
| _LOGGER.debug(f"Getting latest contract for CUPS {cups[-5:]}") | ||
| try: | ||
| contracts = await self.get_contracts(cups=cups) | ||
| if not contracts: | ||
| _LOGGER.warning(f"No contracts found for CUPS {cups[-5:]}") | ||
| return None | ||
| # Pick the contract with the latest end date | ||
| latest_contract = max(contracts, key=lambda c: c.date_end) | ||
| _LOGGER.debug( | ||
| f"Found latest contract for CUPS {cups[-5:]} " | ||
| f"period {latest_contract.date_start.date()}-{latest_contract.date_end.date()}" | ||
| ) | ||
| return latest_contract | ||
| except Exception as e: | ||
| _LOGGER.error( | ||
| f"Error getting latest contract for CUPS {cups[-5:]}: {str(e)}" | ||
| ) | ||
| return None | ||
| async def get_contract_summary(self, cups: str) -> Dict[str, Any]: | ||
| """Get contract summary attributes for a CUPS. | ||
| Args: | ||
| cups: CUPS identifier | ||
| Returns: | ||
| Dict with contract summary attributes | ||
| """ | ||
| _LOGGER.debug(f"Getting contract summary for CUPS {cups[-5:]}") | ||
| try: | ||
| # Get the most recent contract | ||
| latest_contract = await self.get_latest_contract(cups) | ||
| if not latest_contract: | ||
| _LOGGER.warning(f"No contracts found for CUPS {cups[-5:]}") | ||
| return { | ||
| "contract_p1_kW": None, | ||
| "contract_p2_kW": None, | ||
| } | ||
| summary = { | ||
| "contract_p1_kW": latest_contract.power_p1, | ||
| "contract_p2_kW": latest_contract.power_p2, | ||
| # Add other contract-related summary attributes here as needed | ||
| } | ||
| _LOGGER.debug(f"Contract summary calculated for CUPS {cups[-5:]}") | ||
| return summary | ||
| except Exception as e: | ||
| _LOGGER.error( | ||
| f"Error getting contract summary for CUPS {cups[-5:]}: {str(e)}" | ||
| ) | ||
| return { | ||
| "contract_p1_kW": None, | ||
| "contract_p2_kW": None, | ||
| } | ||
| async def get_contract_stats(self, cups: str) -> Dict[str, Any]: | ||
| """Get statistics about contracts for a CUPS. | ||
| Args: | ||
| cups: CUPS identifier | ||
| Returns: | ||
| Dict with contract statistics | ||
| """ | ||
| _LOGGER.debug(f"Getting contract statistics for CUPS {cups[-5:]}") | ||
| try: | ||
| contracts = await self.get_contracts(cups=cups) | ||
| if not contracts: | ||
| return { | ||
| "total_contracts": 0, | ||
| "date_range": None, | ||
| "power_ranges": {}, | ||
| } | ||
| # Calculate date range | ||
| earliest_start = min(c.date_start for c in contracts) | ||
| latest_end = max(c.date_end for c in contracts) | ||
| # Calculate power ranges | ||
| p1_powers = [c.power_p1 for c in contracts if c.power_p1 is not None] | ||
| p2_powers = [c.power_p2 for c in contracts if c.power_p2 is not None] | ||
| power_ranges = {} | ||
| if p1_powers: | ||
| power_ranges["p1_kw"] = {"min": min(p1_powers), "max": max(p1_powers)} | ||
| if p2_powers: | ||
| power_ranges["p2_kw"] = {"min": min(p2_powers), "max": max(p2_powers)} | ||
| stats = { | ||
| "total_contracts": len(contracts), | ||
| "date_range": { | ||
| "earliest_start": earliest_start, | ||
| "latest_end": latest_end, | ||
| }, | ||
| "power_ranges": power_ranges, | ||
| } | ||
| _LOGGER.debug(f"Contract statistics calculated for CUPS {cups[-5:]}") | ||
| return stats | ||
| except Exception as e: | ||
| _LOGGER.error( | ||
| f"Error getting contract statistics for CUPS {cups[-5:]}: {str(e)}" | ||
| ) | ||
| return {} |
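The date filtering in `get_contracts` keeps any contract whose period overlaps the requested window: a contract is dropped only when it ends before the window starts or starts after the window ends. A self-contained sketch of that interval test, with the `ContractModel` fields reduced to plain datetimes:

```python
from datetime import datetime
from typing import Optional

def period_overlaps(c_start: datetime, c_end: datetime,
                    start: Optional[datetime], end: Optional[datetime]) -> bool:
    """Mirror the two skip checks used when filtering contracts."""
    if start is not None and c_end < start:
        return False  # contract ended before the window opened
    if end is not None and c_start > end:
        return False  # contract starts after the window closed
    return True

# A contract running Jan-Jun 2023 overlaps a window opening in June
print(period_overlaps(datetime(2023, 1, 1), datetime(2023, 6, 30),
                      datetime(2023, 6, 1), None))  # True
```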
| """Database service for edata using SQLModel and SQLite with async support.""" | ||
| import hashlib | ||
| import os | ||
| from datetime import datetime as DateTime | ||
| from typing import List, Optional | ||
| from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine | ||
| from sqlmodel import SQLModel, desc, select | ||
| from edata.const import DEFAULT_STORAGE_DIR | ||
| from edata.models.database import ( | ||
| BillingModel, | ||
| ConsumptionModel, | ||
| ContractModel, | ||
| MaxPowerModel, | ||
| PVPCPricesModel, | ||
| SupplyModel, | ||
| ) | ||
| class DatabaseService: | ||
| """Service for managing the SQLite database with async support.""" | ||
| def __init__(self, storage_dir: Optional[str] = None): | ||
| """Initialize database service. | ||
| Args: | ||
| storage_dir: Directory to store database, defaults to same as cache | ||
| """ | ||
| if storage_dir is None: | ||
| storage_dir = DEFAULT_STORAGE_DIR | ||
| self._db_dir = storage_dir | ||
| os.makedirs(self._db_dir, exist_ok=True) | ||
| db_path = os.path.join(self._db_dir, "edata.db") | ||
| # Use aiosqlite for async SQLite operations | ||
| self._engine = create_async_engine(f"sqlite+aiosqlite:///{db_path}") | ||
| async def create_tables(self): | ||
| """Create tables asynchronously.""" | ||
| async with self._engine.begin() as conn: | ||
| await conn.run_sync(SQLModel.metadata.create_all) | ||
| def get_session(self) -> AsyncSession: | ||
| """Get an async database session.""" | ||
| return AsyncSession(self._engine) | ||
| async def save_supply(self, supply_data: dict) -> SupplyModel: | ||
| """Save or update a supply record.""" | ||
| async with self.get_session() as session: | ||
| # Check if supply exists | ||
| existing = await session.get(SupplyModel, supply_data["cups"]) | ||
| if existing: | ||
| # Update existing record | ||
| for key, value in supply_data.items(): | ||
| if hasattr(existing, key) and key != "cups": | ||
| setattr(existing, key, value) | ||
| existing.updated_at = DateTime.now() | ||
| session.add(existing) | ||
| await session.commit() | ||
| await session.refresh(existing) | ||
| return existing | ||
| else: | ||
| # Create new record | ||
| supply = SupplyModel(**supply_data) | ||
| session.add(supply) | ||
| await session.commit() | ||
| await session.refresh(supply) | ||
| return supply | ||
| async def save_contract(self, contract_data: dict) -> ContractModel: | ||
| """Save or update a contract record.""" | ||
| async with self.get_session() as session: | ||
| # Check if contract exists (by cups + date_start) | ||
| stmt = select(ContractModel).where( | ||
| ContractModel.cups == contract_data["cups"], | ||
| ContractModel.date_start == contract_data["date_start"], | ||
| ) | ||
| result = await session.execute(stmt) | ||
| existing = result.scalar_one_or_none() | ||
| if existing: | ||
| # Update existing record | ||
| for key, value in contract_data.items(): | ||
| if hasattr(existing, key): | ||
| setattr(existing, key, value) | ||
| existing.updated_at = DateTime.now() | ||
| session.add(existing) | ||
| await session.commit() | ||
| await session.refresh(existing) | ||
| return existing | ||
| else: | ||
| # Create new record | ||
| contract = ContractModel(**contract_data) | ||
| session.add(contract) | ||
| await session.commit() | ||
| await session.refresh(contract) | ||
| return contract | ||
| async def save_consumption(self, consumption_data: dict) -> ConsumptionModel: | ||
| """Save or update a consumption record.""" | ||
| async with self.get_session() as session: | ||
| # Check if consumption exists (by cups + datetime) | ||
| stmt = select(ConsumptionModel).where( | ||
| ConsumptionModel.cups == consumption_data["cups"], | ||
| ConsumptionModel.datetime == consumption_data["datetime"], | ||
| ) | ||
| result = await session.execute(stmt) | ||
| existing = result.scalar_one_or_none() | ||
| if existing: | ||
| # Update existing record | ||
| for key, value in consumption_data.items(): | ||
| if hasattr(existing, key): | ||
| setattr(existing, key, value) | ||
| existing.updated_at = DateTime.now() | ||
| session.add(existing) | ||
| await session.commit() | ||
| await session.refresh(existing) | ||
| return existing | ||
| else: | ||
| # Create new record | ||
| consumption = ConsumptionModel(**consumption_data) | ||
| session.add(consumption) | ||
| await session.commit() | ||
| await session.refresh(consumption) | ||
| return consumption | ||
| async def save_maxpower(self, maxpower_data: dict) -> MaxPowerModel: | ||
| """Save or update a maxpower record.""" | ||
| async with self.get_session() as session: | ||
| # Check if maxpower exists (by cups + datetime) | ||
| stmt = select(MaxPowerModel).where( | ||
| MaxPowerModel.cups == maxpower_data["cups"], | ||
| MaxPowerModel.datetime == maxpower_data["datetime"], | ||
| ) | ||
| result = await session.execute(stmt) | ||
| existing = result.scalar_one_or_none() | ||
| if existing: | ||
| # Update existing record | ||
| for key, value in maxpower_data.items(): | ||
| if hasattr(existing, key): | ||
| setattr(existing, key, value) | ||
| existing.updated_at = DateTime.now() | ||
| session.add(existing) | ||
| await session.commit() | ||
| await session.refresh(existing) | ||
| return existing | ||
| else: | ||
| # Create new record | ||
| maxpower = MaxPowerModel(**maxpower_data) | ||
| session.add(maxpower) | ||
| await session.commit() | ||
| await session.refresh(maxpower) | ||
| return maxpower | ||
| async def get_supply(self, cups: str) -> Optional[SupplyModel]: | ||
| """Get a supply by CUPS.""" | ||
| async with self.get_session() as session: | ||
| return await session.get(SupplyModel, cups) | ||
| async def get_supplies(self, cups: Optional[str] = None) -> List[SupplyModel]: | ||
| """Get supplies, optionally filtered by CUPS.""" | ||
| async with self.get_session() as session: | ||
| stmt = select(SupplyModel) | ||
| if cups: | ||
| stmt = stmt.where(SupplyModel.cups == cups) | ||
| result = await session.execute(stmt) | ||
| return list(result.scalars().all()) | ||
| async def get_latest_supply( | ||
| self, cups: Optional[str] = None | ||
| ) -> Optional[SupplyModel]: | ||
| """Get the most recently updated supply, optionally filtered by CUPS.""" | ||
| async with self.get_session() as session: | ||
| stmt = select(SupplyModel) | ||
| if cups: | ||
| stmt = stmt.where(SupplyModel.cups == cups) | ||
| stmt = stmt.order_by(desc(SupplyModel.updated_at)) | ||
| result = await session.execute(stmt) | ||
| return result.scalar_one_or_none() | ||
| async def get_contracts(self, cups: str) -> List[ContractModel]: | ||
| """Get all contracts for a CUPS.""" | ||
| async with self.get_session() as session: | ||
| stmt = select(ContractModel).where(ContractModel.cups == cups) | ||
| result = await session.execute(stmt) | ||
| return list(result.scalars().all()) | ||
| async def get_latest_contract(self, cups: str) -> Optional[ContractModel]: | ||
| """Get the most recently started contract for a CUPS.""" | ||
| async with self.get_session() as session: | ||
| stmt = select(ContractModel).where(ContractModel.cups == cups) | ||
| stmt = stmt.order_by(desc(ContractModel.date_start)) | ||
| result = await session.execute(stmt) | ||
| return result.scalar_one_or_none() | ||
| async def get_consumptions( | ||
| self, | ||
| cups: str, | ||
| start_date: Optional[DateTime] = None, | ||
| end_date: Optional[DateTime] = None, | ||
| ) -> List[ConsumptionModel]: | ||
| """Get consumptions for a CUPS within date range.""" | ||
| async with self.get_session() as session: | ||
| stmt = select(ConsumptionModel).where(ConsumptionModel.cups == cups) | ||
| if start_date: | ||
| stmt = stmt.where(ConsumptionModel.datetime >= start_date) | ||
| if end_date: | ||
| stmt = stmt.where(ConsumptionModel.datetime <= end_date) | ||
| result = await session.execute(stmt) | ||
| return list(result.scalars().all()) | ||
| async def get_latest_consumption(self, cups: str) -> Optional[ConsumptionModel]: | ||
| """Get the most recent consumption record for a CUPS.""" | ||
| async with self.get_session() as session: | ||
| stmt = select(ConsumptionModel).where(ConsumptionModel.cups == cups) | ||
| stmt = stmt.order_by(desc(ConsumptionModel.datetime)) | ||
| result = await session.execute(stmt) | ||
| return result.scalar_one_or_none() | ||
| async def get_maxpower_readings( | ||
| self, | ||
| cups: str, | ||
| start_date: Optional[DateTime] = None, | ||
| end_date: Optional[DateTime] = None, | ||
| ) -> List[MaxPowerModel]: | ||
| """Get maxpower readings for a CUPS within date range.""" | ||
| async with self.get_session() as session: | ||
| stmt = select(MaxPowerModel).where(MaxPowerModel.cups == cups) | ||
| if start_date: | ||
| stmt = stmt.where(MaxPowerModel.datetime >= start_date) | ||
| if end_date: | ||
| stmt = stmt.where(MaxPowerModel.datetime <= end_date) | ||
| result = await session.execute(stmt) | ||
| return list(result.scalars().all()) | ||
| async def get_latest_maxpower(self, cups: str) -> Optional[MaxPowerModel]: | ||
| """Get the most recent maxpower reading for a CUPS.""" | ||
| async with self.get_session() as session: | ||
| stmt = select(MaxPowerModel).where(MaxPowerModel.cups == cups) | ||
| stmt = stmt.order_by(desc(MaxPowerModel.datetime)) | ||
| result = await session.execute(stmt) | ||
| return result.scalar_one_or_none() | ||
| async def save_pvpc_price(self, price_data: dict) -> PVPCPricesModel: | ||
| """Save or update a PVPC price record.""" | ||
| async with self.get_session() as session: | ||
| # Check if price exists (by datetime and geo_id) | ||
| stmt = select(PVPCPricesModel).where( | ||
| PVPCPricesModel.datetime == price_data["datetime"] | ||
| ) | ||
| if "geo_id" in price_data: | ||
| stmt = stmt.where(PVPCPricesModel.geo_id == price_data["geo_id"]) | ||
| result = await session.execute(stmt) | ||
| existing = result.scalar_one_or_none() | ||
| if existing: | ||
| # Update existing record | ||
| for key, value in price_data.items(): | ||
| if hasattr(existing, key): | ||
| setattr(existing, key, value) | ||
| existing.updated_at = DateTime.now() | ||
| session.add(existing) | ||
| await session.commit() | ||
| await session.refresh(existing) | ||
| return existing | ||
| else: | ||
| # Create new record | ||
| price = PVPCPricesModel(**price_data) | ||
| session.add(price) | ||
| await session.commit() | ||
| await session.refresh(price) | ||
| return price | ||
| async def get_pvpc_prices( | ||
| self, | ||
| start_date: Optional[DateTime] = None, | ||
| end_date: Optional[DateTime] = None, | ||
| geo_id: Optional[int] = None, | ||
| ) -> List[PVPCPricesModel]: | ||
| """Get PVPC prices within date range.""" | ||
| async with self.get_session() as session: | ||
| stmt = select(PVPCPricesModel) | ||
| if start_date: | ||
| stmt = stmt.where(PVPCPricesModel.datetime >= start_date) | ||
| if end_date: | ||
| stmt = stmt.where(PVPCPricesModel.datetime <= end_date) | ||
| if geo_id is not None: | ||
| stmt = stmt.where(PVPCPricesModel.geo_id == geo_id) | ||
| result = await session.execute(stmt) | ||
| return list(result.scalars().all()) | ||
| async def get_latest_pvpc_price( | ||
| self, geo_id: Optional[int] = None | ||
| ) -> Optional[PVPCPricesModel]: | ||
| """Get the most recent PVPC price, optionally filtered by geo_id.""" | ||
| async with self.get_session() as session: | ||
| stmt = select(PVPCPricesModel) | ||
| if geo_id is not None: | ||
| stmt = stmt.where(PVPCPricesModel.geo_id == geo_id) | ||
| stmt = stmt.order_by(desc(PVPCPricesModel.datetime)) | ||
| result = await session.execute(stmt) | ||
| return result.scalar_one_or_none() | ||
| async def save_billing(self, billing_data: dict) -> BillingModel: | ||
| """Save or update a billing record.""" | ||
| async with self.get_session() as session: | ||
| # Check if billing exists (by cups + datetime + pricing_config_hash) | ||
| stmt = select(BillingModel).where( | ||
| BillingModel.cups == billing_data["cups"], | ||
| BillingModel.datetime == billing_data["datetime"], | ||
| BillingModel.pricing_config_hash == billing_data["pricing_config_hash"], | ||
| ) | ||
| result = await session.execute(stmt) | ||
| existing = result.scalar_one_or_none() | ||
| if existing: | ||
| # Update existing record | ||
| for key, value in billing_data.items(): | ||
| if hasattr(existing, key): | ||
| setattr(existing, key, value) | ||
| existing.updated_at = DateTime.now() | ||
| session.add(existing) | ||
| await session.commit() | ||
| await session.refresh(existing) | ||
| return existing | ||
| else: | ||
| # Create new record | ||
| billing = BillingModel(**billing_data) | ||
| session.add(billing) | ||
| await session.commit() | ||
| await session.refresh(billing) | ||
| return billing | ||
| async def get_billing( | ||
| self, | ||
| cups: str, | ||
| start_date: Optional[DateTime] = None, | ||
| end_date: Optional[DateTime] = None, | ||
| pricing_config_hash: Optional[str] = None, | ||
| ) -> List[BillingModel]: | ||
| """Get billing records for a CUPS within date range.""" | ||
| async with self.get_session() as session: | ||
| stmt = select(BillingModel).where(BillingModel.cups == cups) | ||
| if start_date: | ||
| stmt = stmt.where(BillingModel.datetime >= start_date) | ||
| if end_date: | ||
| stmt = stmt.where(BillingModel.datetime <= end_date) | ||
| if pricing_config_hash: | ||
| stmt = stmt.where( | ||
| BillingModel.pricing_config_hash == pricing_config_hash | ||
| ) | ||
| result = await session.execute(stmt) | ||
| return list(result.scalars().all()) | ||
| async def get_latest_billing( | ||
| self, cups: str, pricing_config_hash: Optional[str] = None | ||
| ) -> Optional[BillingModel]: | ||
| """Get the most recent billing record for a CUPS.""" | ||
| async with self.get_session() as session: | ||
| stmt = select(BillingModel).where(BillingModel.cups == cups) | ||
| if pricing_config_hash: | ||
| stmt = stmt.where( | ||
| BillingModel.pricing_config_hash == pricing_config_hash | ||
| ) | ||
| stmt = stmt.order_by(desc(BillingModel.datetime)) | ||
| result = await session.execute(stmt) | ||
| return result.scalar_one_or_none() | ||
| async def delete_billing( | ||
| self, | ||
| cups: str, | ||
| pricing_config_hash: str, | ||
| start_date: Optional[DateTime] = None, | ||
| end_date: Optional[DateTime] = None, | ||
| ) -> int: | ||
| """Delete billing records for a specific configuration and optional date range.""" | ||
| async with self.get_session() as session: | ||
| stmt = select(BillingModel).where( | ||
| BillingModel.cups == cups, | ||
| BillingModel.pricing_config_hash == pricing_config_hash, | ||
| ) | ||
| if start_date: | ||
| stmt = stmt.where(BillingModel.datetime >= start_date) | ||
| if end_date: | ||
| stmt = stmt.where(BillingModel.datetime <= end_date) | ||
| result = await session.execute(stmt) | ||
| billing_records = list(result.scalars().all()) | ||
| count = len(billing_records) | ||
| for record in billing_records: | ||
| await session.delete(record) | ||
| await session.commit() | ||
| return count | ||
| @staticmethod | ||
| def generate_pricing_config_hash(pricing_rules_dict: dict) -> str: | ||
| """Generate a hash for pricing rules configuration.""" | ||
| # Create a normalized string representation for hashing | ||
| # (sorts top-level keys only; nested dict values are not normalized) | ||
| config_str = str(sorted(pricing_rules_dict.items())) | ||
| return hashlib.sha256(config_str.encode()).hexdigest()[:16] | ||
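The hashing scheme above can be exercised in isolation. A minimal sketch replicating the same normalization (the standalone function name is illustrative, not the library API) shows that top-level key order does not affect the result:

```python
import hashlib

def pricing_config_hash(rules: dict) -> str:
    # Normalize by sorting top-level keys, then hash and truncate to 16 hex chars.
    # Nested dicts are not recursively sorted, so only top-level order is normalized.
    config_str = str(sorted(rules.items()))
    return hashlib.sha256(config_str.encode()).hexdigest()[:16]

a = pricing_config_hash({"p1_kwh": 0.12, "p2_kwh": 0.06})
b = pricing_config_hash({"p2_kwh": 0.06, "p1_kwh": 0.12})
print(a == b, len(a))  # True 16
```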
| async def save_from_pydantic_models( | ||
| self, | ||
| cups: str, | ||
| supplies: List, | ||
| contracts: List, | ||
| consumptions: List, | ||
| maximeter: List, | ||
| ): | ||
| """Save data from Pydantic models to database.""" | ||
| # Save supplies | ||
| for supply in supplies: | ||
| supply_dict = supply.model_dump() | ||
| await self.save_supply(supply_dict) | ||
| # Save contracts with CUPS | ||
| for contract in contracts: | ||
| contract_dict = contract.model_dump() | ||
| contract_dict["cups"] = cups | ||
| await self.save_contract(contract_dict) | ||
| # Save consumptions with CUPS | ||
| for consumption in consumptions: | ||
| consumption_dict = consumption.model_dump() | ||
| consumption_dict["cups"] = cups | ||
| await self.save_consumption(consumption_dict) | ||
| # Save maximeter readings with CUPS | ||
| for maxpower in maximeter: | ||
| maxpower_dict = maxpower.model_dump() | ||
| maxpower_dict["cups"] = cups | ||
| await self.save_maxpower(maxpower_dict) | ||
| # Global database service instance | ||
| _db_service: Optional[DatabaseService] = None | ||
| async def get_database_service(storage_dir: Optional[str] = None) -> DatabaseService: | ||
| """Get the global database service instance.""" | ||
| global _db_service | ||
| if _db_service is None: | ||
| _db_service = DatabaseService(storage_dir) | ||
| # Initialize tables on first access | ||
| await _db_service.create_tables() | ||
| return _db_service |
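`get_database_service` above is a lazy async singleton: the first call builds the service and initializes the tables, and later calls reuse the same instance. The pattern in miniature, with a hypothetical `Service` stand-in:

```python
import asyncio
from typing import Optional

class Service:
    def __init__(self):
        self.initialized = False

    async def create_tables(self):
        self.initialized = True  # stands in for real schema setup

_service: Optional[Service] = None

async def get_service() -> Service:
    """Lazily build and initialize the shared instance on first access."""
    global _service
    if _service is None:
        _service = Service()
        await _service.create_tables()
    return _service

async def main():
    a = await get_service()
    b = await get_service()
    return a is b and a.initialized

print(asyncio.run(main()))  # True
```

Note that, as in the original accessor, two tasks racing on the very first call could observe the instance before `create_tables` finishes; an `asyncio.Lock` around the initialization would close that gap.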
| """Maximeter service for fetching and updating maximum power data.""" | ||
| import logging | ||
| from datetime import datetime, timedelta | ||
| from typing import Any, Dict, List, Optional | ||
| from edata.connectors.datadis import DatadisConnector | ||
| from edata.models.maximeter import MaxPower | ||
| from edata.services.database import DatabaseService, get_database_service | ||
| _LOGGER = logging.getLogger(__name__) | ||
| class MaximeterService: | ||
| """Service for managing maximum power data fetching and storage.""" | ||
| def __init__( | ||
| self, | ||
| datadis_connector: DatadisConnector, | ||
| storage_dir: Optional[str] = None, | ||
| ): | ||
| """Initialize maximeter service. | ||
| Args: | ||
| datadis_connector: Configured Datadis connector instance | ||
| storage_dir: Directory for database and cache storage | ||
| """ | ||
| self._datadis = datadis_connector | ||
| self._storage_dir = storage_dir | ||
| self._db_service = None | ||
| async def _get_db_service(self) -> DatabaseService: | ||
| """Get database service, initializing if needed.""" | ||
| if self._db_service is None: | ||
| self._db_service = await get_database_service(self._storage_dir) | ||
| return self._db_service | ||
| async def update_maxpower( | ||
| self, | ||
| cups: str, | ||
| distributor_code: str, | ||
| start_date: datetime, | ||
| end_date: datetime, | ||
| authorized_nif: Optional[str] = None, | ||
| force_full_update: bool = False, | ||
| ) -> Dict[str, Any]: | ||
| """Update maximeter (maximum power) data for a CUPS in the specified date range. | ||
| Args: | ||
| cups: CUPS identifier | ||
| distributor_code: Distributor company code | ||
| start_date: Start date for maxpower data | ||
| end_date: End date for maxpower data | ||
| authorized_nif: Authorized NIF if accessing on behalf of someone | ||
| force_full_update: If True, fetch all data ignoring existing records | ||
| Returns: | ||
| Dict with operation results and statistics | ||
| """ | ||
| _LOGGER.info( | ||
| f"Updating maxpower for CUPS {cups[-5:]:>5} from {start_date.date()} to {end_date.date()}" | ||
| ) | ||
| # Determine actual start date based on existing data | ||
| actual_start_date = start_date | ||
| if not force_full_update: | ||
| last_maxpower_date = await self.get_last_maxpower_date(cups) | ||
| if last_maxpower_date: | ||
| # Start one hour after the last maxpower reading | ||
| actual_start_date = max( | ||
| start_date, last_maxpower_date + timedelta(hours=1) | ||
| ) | ||
| _LOGGER.info( | ||
| f"Found existing maxpower data up to {last_maxpower_date.date()}, fetching from {actual_start_date.date()}" | ||
| ) | ||
| else: | ||
| _LOGGER.info( | ||
| f"No existing maxpower data found for CUPS {cups[-5:]:>5}, fetching all data" | ||
| ) | ||
| # If actual start date is beyond end date, no new data needed | ||
| if actual_start_date >= end_date: | ||
| _LOGGER.info( | ||
| f"No new maxpower data needed for CUPS {cups[-5:]:>5} (up to date)" | ||
| ) | ||
| return { | ||
| "success": True, | ||
| "cups": cups, | ||
| "period": { | ||
| "start": start_date.isoformat(), | ||
| "end": end_date.isoformat(), | ||
| "actual_start": actual_start_date.isoformat(), | ||
| }, | ||
| "stats": { | ||
| "fetched": 0, | ||
| "saved": 0, | ||
| "updated": 0, | ||
| "skipped": "up_to_date", | ||
| }, | ||
| "message": "Maxpower data is up to date", | ||
| } | ||
| try: | ||
| # Fetch maxpower data from datadis (only missing data) | ||
| maxpower_readings = await self._datadis.get_max_power( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=actual_start_date, | ||
| end_date=end_date, | ||
| authorized_nif=authorized_nif, | ||
| ) | ||
| # Save to database (fetch the service once, outside the loop) | ||
| db_service = await self._get_db_service() | ||
| saved_count = 0 | ||
| updated_count = 0 | ||
| for maxpower in maxpower_readings: | ||
| # Convert Pydantic model to dict and add CUPS | ||
| maxpower_dict = maxpower.model_dump() | ||
| maxpower_dict["cups"] = cups | ||
| # Check if maxpower reading already exists | ||
| existing = await db_service.get_maxpower_readings( | ||
| cups=cups, start_date=maxpower.datetime, end_date=maxpower.datetime | ||
| ) | ||
| if existing: | ||
| updated_count += 1 | ||
| else: | ||
| saved_count += 1 | ||
| await db_service.save_maxpower(maxpower_dict) | ||
| result = { | ||
| "success": True, | ||
| "cups": cups, | ||
| "period": { | ||
| "start": start_date.isoformat(), | ||
| "end": end_date.isoformat(), | ||
| "actual_start": actual_start_date.isoformat(), | ||
| }, | ||
| "stats": { | ||
| "fetched": len(maxpower_readings), | ||
| "saved": saved_count, | ||
| "updated": updated_count, | ||
| }, | ||
| } | ||
| if actual_start_date > start_date: | ||
| result["message"] = ( | ||
| f"Fetched only missing maxpower data from {actual_start_date.date()}" | ||
| ) | ||
| _LOGGER.info( | ||
| f"Maxpower update completed: {len(maxpower_readings)} fetched, " | ||
| f"{saved_count} saved, {updated_count} updated" | ||
| ) | ||
| return result | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error updating maxpower for CUPS {cups}: {str(e)}") | ||
| return { | ||
| "success": False, | ||
| "cups": cups, | ||
| "error": str(e), | ||
| "period": { | ||
| "start": start_date.isoformat(), | ||
| "end": end_date.isoformat(), | ||
| "actual_start": actual_start_date.isoformat(), | ||
| }, | ||
| } | ||
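The incremental-fetch decision above (resume one hour past the last stored reading, but never before the requested start) reduces to a small pure function. A sketch with an illustrative helper name:

```python
from datetime import datetime, timedelta
from typing import Optional

def resume_from(start: datetime, last_stored: Optional[datetime]) -> datetime:
    """Return the effective fetch start: one hour past the last stored
    reading, clamped so it never precedes the requested start."""
    if last_stored is None:
        return start
    return max(start, last_stored + timedelta(hours=1))

# No stored data: fetch the full requested range.
print(resume_from(datetime(2024, 1, 1), None))  # 2024-01-01 00:00:00
# Stored data inside the range: resume just after it.
print(resume_from(datetime(2024, 1, 1), datetime(2024, 1, 10, 23)))
```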
| async def update_maxpower_range_by_months( | ||
| self, | ||
| cups: str, | ||
| distributor_code: str, | ||
| start_date: datetime, | ||
| end_date: datetime, | ||
| authorized_nif: Optional[str] = None, | ||
| force_full_update: bool = False, | ||
| ) -> Dict[str, Any]: | ||
| """Update maxpower data month by month to respect datadis limits. | ||
| Args: | ||
| cups: CUPS identifier | ||
| distributor_code: Distributor company code | ||
| start_date: Start date for maxpower data | ||
| end_date: End date for maxpower data | ||
| authorized_nif: Authorized NIF if accessing on behalf of someone | ||
| force_full_update: If True, fetch all data ignoring existing records | ||
| Returns: | ||
| Dict with operation results and statistics for all months | ||
| """ | ||
| _LOGGER.info( | ||
| f"Updating maxpower range for CUPS {cups[-5:]:>5} " | ||
| f"from {start_date.date()} to {end_date.date()} by months" | ||
| ) | ||
| results = [] | ||
| current_date = start_date | ||
| while current_date < end_date: | ||
| # Calculate month end | ||
| if current_date.month == 12: | ||
| month_end = current_date.replace( | ||
| year=current_date.year + 1, month=1, day=1 | ||
| ) | ||
| else: | ||
| month_end = current_date.replace(month=current_date.month + 1, day=1) | ||
| # Don't go past the requested end date | ||
| actual_end = min(month_end, end_date) | ||
| # Update maxpower for this month | ||
| maxpower_result = await self.update_maxpower( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=current_date, | ||
| end_date=actual_end, | ||
| authorized_nif=authorized_nif, | ||
| force_full_update=force_full_update, | ||
| ) | ||
| result_entry = { | ||
| "month": current_date.strftime("%Y-%m"), | ||
| "maxpower": maxpower_result, | ||
| } | ||
| results.append(result_entry) | ||
| current_date = month_end | ||
| # Calculate totals | ||
| total_maxpower_fetched = sum( | ||
| r["maxpower"]["stats"]["fetched"] | ||
| for r in results | ||
| if r["maxpower"]["success"] | ||
| ) | ||
| total_maxpower_saved = sum( | ||
| r["maxpower"]["stats"]["saved"] for r in results if r["maxpower"]["success"] | ||
| ) | ||
| total_maxpower_updated = sum( | ||
| r["maxpower"]["stats"]["updated"] | ||
| for r in results | ||
| if r["maxpower"]["success"] | ||
| ) | ||
| summary = { | ||
| "success": all(r["maxpower"]["success"] for r in results), | ||
| "cups": cups, | ||
| "period": {"start": start_date.isoformat(), "end": end_date.isoformat()}, | ||
| "months_processed": len(results), | ||
| "total_stats": { | ||
| "maxpower_fetched": total_maxpower_fetched, | ||
| "maxpower_saved": total_maxpower_saved, | ||
| "maxpower_updated": total_maxpower_updated, | ||
| }, | ||
| "monthly_results": results, | ||
| } | ||
| _LOGGER.info( | ||
| f"Maxpower range update completed: {len(results)} months processed, " | ||
| f"{total_maxpower_fetched} maxpower readings fetched" | ||
| ) | ||
| return summary | ||
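The month-by-month loop above can be factored into a generator for isolated testing. A sketch under the same boundary rules (each window closes on the first of the next month, clamped to the requested end):

```python
from datetime import datetime
from typing import Iterator, Tuple

def month_windows(start: datetime, end: datetime) -> Iterator[Tuple[datetime, datetime]]:
    """Yield (window_start, window_end) pairs, one per calendar month."""
    current = start
    while current < end:
        if current.month == 12:
            month_end = current.replace(year=current.year + 1, month=1, day=1)
        else:
            month_end = current.replace(month=current.month + 1, day=1)
        yield current, min(month_end, end)
        current = month_end

windows = list(month_windows(datetime(2022, 1, 15), datetime(2022, 3, 10)))
print(len(windows))  # 3: Jan 15-Feb 1, Feb 1-Mar 1, Mar 1-Mar 10
```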
| async def get_stored_maxpower( | ||
| self, | ||
| cups: str, | ||
| start_date: Optional[datetime] = None, | ||
| end_date: Optional[datetime] = None, | ||
| ) -> List: | ||
| """Get stored maxpower readings from database. | ||
| Args: | ||
| cups: CUPS identifier | ||
| start_date: Optional start date filter | ||
| end_date: Optional end date filter | ||
| Returns: | ||
| List of MaxPower objects from database | ||
| """ | ||
| db_service = await self._get_db_service() | ||
| return await db_service.get_maxpower_readings(cups, start_date, end_date) | ||
| async def get_last_maxpower_date(self, cups: str) -> Optional[datetime]: | ||
| """Get the date of the last maxpower record in the database. | ||
| Args: | ||
| cups: CUPS identifier | ||
| Returns: | ||
| Datetime of last maxpower reading or None if no data exists | ||
| """ | ||
| db_service = await self._get_db_service() | ||
| latest_maxpower = await db_service.get_latest_maxpower(cups) | ||
| if latest_maxpower: | ||
| return latest_maxpower.datetime | ||
| return None | ||
| async def get_peak_power_for_period( | ||
| self, cups: str, start_date: datetime, end_date: datetime | ||
| ) -> Optional[MaxPower]: | ||
| """Get the peak power reading for a specific period. | ||
| Args: | ||
| cups: CUPS identifier | ||
| start_date: Start date for search | ||
| end_date: End date for search | ||
| Returns: | ||
| MaxPower object with highest value_kw in the period, or None if no data | ||
| """ | ||
| readings = await self.get_stored_maxpower(cups, start_date, end_date) | ||
| if not readings: | ||
| return None | ||
| # Find the reading with maximum power | ||
| peak_reading = max(readings, key=lambda r: r.value_kw) | ||
| return peak_reading | ||
| async def get_daily_peaks( | ||
| self, cups: str, start_date: datetime, end_date: datetime | ||
| ) -> Dict[str, MaxPower]: | ||
| """Get daily peak power readings for a date range. | ||
| Args: | ||
| cups: CUPS identifier | ||
| start_date: Start date | ||
| end_date: End date | ||
| Returns: | ||
| Dict with date strings as keys and MaxPower objects as values | ||
| """ | ||
| readings = await self.get_stored_maxpower(cups, start_date, end_date) | ||
| if not readings: | ||
| return {} | ||
| # Group by date and find peak for each day | ||
| daily_peaks = {} | ||
| for reading in readings: | ||
| date_key = reading.datetime.date().isoformat() | ||
| if ( | ||
| date_key not in daily_peaks | ||
| or reading.value_kw > daily_peaks[date_key].value_kw | ||
| ): | ||
| daily_peaks[date_key] = reading | ||
| return daily_peaks | ||
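The per-day grouping in `get_daily_peaks` is straightforward to check against plain data. A sketch using a stand-in dataclass in place of the `MaxPower` model:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List

@dataclass
class Reading:  # stand-in for the MaxPower model
    datetime: datetime
    value_kw: float

def daily_peaks(readings: List[Reading]) -> Dict[str, Reading]:
    """Keep only the highest reading per calendar day, keyed by ISO date."""
    peaks: Dict[str, Reading] = {}
    for r in readings:
        key = r.datetime.date().isoformat()
        if key not in peaks or r.value_kw > peaks[key].value_kw:
            peaks[key] = r
    return peaks

data = [
    Reading(datetime(2024, 5, 1, 9), 3.2),
    Reading(datetime(2024, 5, 1, 20), 4.1),
    Reading(datetime(2024, 5, 2, 13), 2.8),
]
print({k: v.value_kw for k, v in daily_peaks(data).items()})
# {'2024-05-01': 4.1, '2024-05-02': 2.8}
```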
| async def get_maximeter_summary( | ||
| self, | ||
| cups: str, | ||
| start_date: Optional[datetime] = None, | ||
| end_date: Optional[datetime] = None, | ||
| ) -> Dict[str, Any]: | ||
| """Get maximeter summary data compatible with EdataHelper attributes. | ||
| Args: | ||
| cups: CUPS identifier | ||
| start_date: Optional start date filter | ||
| end_date: Optional end date filter | ||
| Returns: | ||
| Dict with summary attributes matching EdataHelper format | ||
| """ | ||
| maximeter_data = await self.get_stored_maxpower(cups, start_date, end_date) | ||
| if not maximeter_data: | ||
| return { | ||
| "max_power_kW": None, | ||
| "max_power_date": None, | ||
| "max_power_mean_kW": None, | ||
| "max_power_90perc_kW": None, | ||
| } | ||
| # Calculate summary statistics | ||
| power_values = [m.value_kw for m in maximeter_data] | ||
| max_power = max(power_values) | ||
| mean_power = sum(power_values) / len(power_values) | ||
| # Find date for max power | ||
| max_power_date = next( | ||
| m.datetime for m in maximeter_data if m.value_kw == max_power | ||
| ) | ||
| # Calculate 90th percentile | ||
| sorted_values = sorted(power_values) | ||
| n = len(sorted_values) | ||
| p90_index = int(0.9 * n) | ||
| p90_power = sorted_values[p90_index] if p90_index < n else sorted_values[-1] | ||
| return { | ||
| "max_power_kW": round(max_power, 2), | ||
| "max_power_date": max_power_date, | ||
| "max_power_mean_kW": round(mean_power, 2), | ||
| "max_power_90perc_kW": round(p90_power, 2), | ||
| } |
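The summary's 90th percentile is a nearest-rank-style pick (index `int(0.9 * n)` into the sorted values), not an interpolated percentile. A sketch makes that behavior concrete:

```python
def p90(values):
    """Nearest-rank-style 90th percentile, matching the index rule above."""
    s = sorted(values)
    n = len(s)
    i = int(0.9 * n)
    return s[i] if i < n else s[-1]

vals = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0]
print(p90(vals))  # int(0.9 * 10) == 9 -> s[9], i.e. 10.0
```

An interpolated percentile (e.g. `numpy.percentile`) would return 9.1 here, so the two definitions are not interchangeable when comparing results.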
| """Supply service for fetching and managing supply data.""" | ||
| import logging | ||
| from datetime import datetime | ||
| from typing import Any, Dict, List, Optional | ||
| from edata.connectors.datadis import DatadisConnector | ||
| from edata.services.database import DatabaseService, SupplyModel, get_database_service | ||
| _LOGGER = logging.getLogger(__name__) | ||
| class SupplyService: | ||
| """Service for managing supply data fetching and storage.""" | ||
| def __init__( | ||
| self, | ||
| datadis_connector: DatadisConnector, | ||
| storage_dir: Optional[str] = None, | ||
| ): | ||
| """Initialize supply service. | ||
| Args: | ||
| datadis_connector: Configured Datadis connector instance | ||
| storage_dir: Directory for database and cache storage | ||
| """ | ||
| self._datadis = datadis_connector | ||
| self._storage_dir = storage_dir | ||
| self._db_service = None | ||
| async def _get_db_service(self) -> DatabaseService: | ||
| """Get database service, initializing if needed.""" | ||
| if self._db_service is None: | ||
| self._db_service = await get_database_service(self._storage_dir) | ||
| return self._db_service | ||
| async def update_supplies( | ||
| self, authorized_nif: Optional[str] = None | ||
| ) -> Dict[str, Any]: | ||
| """Update supply data from Datadis. | ||
| Args: | ||
| authorized_nif: Optional authorized NIF for access | ||
| Returns: | ||
| Dict with operation results and statistics | ||
| """ | ||
| _LOGGER.info("Updating supplies from Datadis") | ||
| try: | ||
| # Fetch supply data from Datadis | ||
| supplies_data = await self._datadis.get_supplies( | ||
| authorized_nif=authorized_nif | ||
| ) | ||
| if not supplies_data: | ||
| _LOGGER.warning("No supply data found") | ||
| return { | ||
| "success": True, | ||
| "stats": { | ||
| "fetched": 0, | ||
| "saved": 0, | ||
| "updated": 0, | ||
| "total_stored": 0, | ||
| }, | ||
| } | ||
| # Save supplies to database | ||
| saved_count = 0 | ||
| updated_count = 0 | ||
| db_service = await self._get_db_service() | ||
| for supply in supplies_data: | ||
| # Convert Pydantic model to dict for database storage | ||
| supply_dict = supply.model_dump() | ||
| # Check if supply already exists | ||
| existing = await db_service.get_supplies(cups=supply.cups) | ||
| if existing: | ||
| updated_count += 1 | ||
| _LOGGER.debug( | ||
| f"Updating existing supply for CUPS {supply.cups[-5:]}" | ||
| ) | ||
| else: | ||
| saved_count += 1 | ||
| _LOGGER.debug(f"Saving new supply for CUPS {supply.cups[-5:]}") | ||
| # Save to database | ||
| await db_service.save_supply(supply_dict) | ||
| # Get total supplies stored | ||
| all_supplies = await db_service.get_supplies() | ||
| total_stored = len(all_supplies) | ||
| result = { | ||
| "success": True, | ||
| "stats": { | ||
| "fetched": len(supplies_data), | ||
| "saved": saved_count, | ||
| "updated": updated_count, | ||
| "total_stored": total_stored, | ||
| }, | ||
| } | ||
| _LOGGER.info( | ||
| f"Supply update completed: " | ||
| f"{len(supplies_data)} fetched, {saved_count} saved, {updated_count} updated" | ||
| ) | ||
| return result | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error updating supplies: {str(e)}") | ||
| return { | ||
| "success": False, | ||
| "error": str(e), | ||
| "stats": {"fetched": 0, "saved": 0, "updated": 0, "total_stored": 0}, | ||
| } | ||
| async def get_supplies(self, cups: Optional[str] = None) -> List[SupplyModel]: | ||
| """Get stored supply data. | ||
| Args: | ||
| cups: Optional CUPS identifier filter | ||
| Returns: | ||
| List of Supply objects | ||
| """ | ||
| _LOGGER.debug(f"Getting supplies{f' for CUPS {cups[-5:]}' if cups else ''}") | ||
| try: | ||
| db_service = await self._get_db_service() | ||
| supplies = await db_service.get_supplies(cups=cups) | ||
| _LOGGER.debug(f"Found {len(supplies)} supplies") | ||
| return supplies | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error getting supplies: {str(e)}") | ||
| return [] | ||
| async def get_supply_by_cups(self, cups: str) -> Optional[SupplyModel]: | ||
| """Get a specific supply by CUPS. | ||
| Args: | ||
| cups: CUPS identifier | ||
| Returns: | ||
| Supply object if found, None otherwise | ||
| """ | ||
| _LOGGER.debug(f"Getting supply for CUPS {cups[-5:]}") | ||
| try: | ||
| db_service = await self._get_db_service() | ||
| supplies = await db_service.get_supplies(cups=cups) | ||
| if supplies: | ||
| _LOGGER.debug(f"Found supply for CUPS {cups[-5:]}") | ||
| return supplies[0] # Should be unique | ||
| _LOGGER.warning(f"No supply found for CUPS {cups[-5:]}") | ||
| return None | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error getting supply for CUPS {cups[-5:]}: {str(e)}") | ||
| return None | ||
| async def get_cups_list(self) -> List[str]: | ||
| """Get list of all stored CUPS. | ||
| Returns: | ||
| List of CUPS identifiers | ||
| """ | ||
| _LOGGER.debug("Getting CUPS list") | ||
| try: | ||
| db_service = await self._get_db_service() | ||
| supplies = await db_service.get_supplies() | ||
| cups_list = [supply.cups for supply in supplies if supply.cups] | ||
| _LOGGER.debug(f"Found {len(cups_list)} CUPS") | ||
| return cups_list | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error getting CUPS list: {str(e)}") | ||
| return [] | ||
| async def get_active_supplies( | ||
| self, reference_date: Optional[datetime] = None | ||
| ) -> List[SupplyModel]: | ||
| """Get supplies that are active at a given date. | ||
| Args: | ||
| reference_date: Date to check for active supplies (defaults to now) | ||
| Returns: | ||
| List of active supplies | ||
| """ | ||
| if reference_date is None: | ||
| reference_date = datetime.now() | ||
| _LOGGER.debug(f"Getting active supplies for date {reference_date.date()}") | ||
| try: | ||
| db_service = await self._get_db_service() | ||
| all_supplies = await db_service.get_supplies() | ||
| active_supplies = [] | ||
| for supply in all_supplies: | ||
| if supply.date_start <= reference_date <= supply.date_end: | ||
| active_supplies.append(supply) | ||
| _LOGGER.debug(f"Found {len(active_supplies)} active supplies") | ||
| return active_supplies | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error getting active supplies: {str(e)}") | ||
| return [] | ||
| async def get_supply_stats(self) -> Dict[str, Any]: | ||
| """Get statistics about stored supplies. | ||
| Returns: | ||
| Dict with supply statistics | ||
| """ | ||
| _LOGGER.debug("Calculating supply statistics") | ||
| try: | ||
| db_service = await self._get_db_service() | ||
| supplies = await db_service.get_supplies() | ||
| if not supplies: | ||
| return { | ||
| "total_supplies": 0, | ||
| "total_cups": 0, | ||
| "date_range": None, | ||
| "distributors": {}, | ||
| "point_types": {}, | ||
| } | ||
| # Calculate date range | ||
| earliest_start = min(s.date_start for s in supplies) | ||
| latest_end = max(s.date_end for s in supplies) | ||
| # Count by distributor | ||
| distributors = {} | ||
| # Count by point type | ||
| point_types = {} | ||
| for supply in supplies: | ||
| # Count distributors | ||
| dist = supply.distributor or "Unknown" | ||
| distributors[dist] = distributors.get(dist, 0) + 1 | ||
| # Count point types | ||
| pt = supply.point_type or "Unknown" | ||
| point_types[pt] = point_types.get(pt, 0) + 1 | ||
| stats = { | ||
| "total_supplies": len(supplies), | ||
| "total_cups": len(set(s.cups for s in supplies)), | ||
| "date_range": { | ||
| "earliest_start": earliest_start, | ||
| "latest_end": latest_end, | ||
| }, | ||
| "distributors": distributors, | ||
| "point_types": point_types, | ||
| } | ||
| _LOGGER.debug(f"Supply statistics: {len(supplies)} total supplies") | ||
| return stats | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error calculating supply statistics: {str(e)}") | ||
| return {} | ||
| async def validate_cups(self, cups: str) -> bool: | ||
| """Validate that a CUPS exists in stored supplies. | ||
| Args: | ||
| cups: CUPS identifier to validate | ||
| Returns: | ||
| True if CUPS exists, False otherwise | ||
| """ | ||
| _LOGGER.debug(f"Validating CUPS {cups[-5:]}") | ||
| try: | ||
| supply = await self.get_supply_by_cups(cups) | ||
| is_valid = supply is not None | ||
| if is_valid: | ||
| _LOGGER.debug(f"CUPS {cups[-5:]} is valid") | ||
| else: | ||
| _LOGGER.warning(f"CUPS {cups[-5:]} not found") | ||
| return is_valid | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error validating CUPS {cups[-5:]}: {str(e)}") | ||
| return False | ||
| async def get_distributor_code(self, cups: str) -> Optional[str]: | ||
| """Get distributor code for a CUPS. | ||
| Args: | ||
| cups: CUPS identifier | ||
| Returns: | ||
| Distributor code if found, None otherwise | ||
| """ | ||
| _LOGGER.debug(f"Getting distributor code for CUPS {cups[-5:]}") | ||
| try: | ||
| supply = await self.get_supply_by_cups(cups) | ||
| if supply and supply.distributor_code: | ||
| _LOGGER.debug( | ||
| f"Found distributor code {supply.distributor_code} for CUPS {cups[-5:]}" | ||
| ) | ||
| return supply.distributor_code | ||
| _LOGGER.warning(f"No distributor code found for CUPS {cups[-5:]}") | ||
| return None | ||
| except Exception as e: | ||
| _LOGGER.error( | ||
| f"Error getting distributor code for CUPS {cups[-5:]}: {str(e)}" | ||
| ) | ||
| return None | ||
| async def get_point_type(self, cups: str) -> Optional[int]: | ||
| """Get point type for a CUPS. | ||
| Args: | ||
| cups: CUPS identifier | ||
| Returns: | ||
| Point type if found, None otherwise | ||
| """ | ||
| _LOGGER.debug(f"Getting point type for CUPS {cups[-5:]}") | ||
| try: | ||
| supply = await self.get_supply_by_cups(cups) | ||
| if supply and supply.point_type is not None: | ||
| _LOGGER.debug( | ||
| f"Found point type {supply.point_type} for CUPS {cups[-5:]}" | ||
| ) | ||
| return supply.point_type | ||
| _LOGGER.warning(f"No point type found for CUPS {cups[-5:]}") | ||
| return None | ||
| except Exception as e: | ||
| _LOGGER.error(f"Error getting point type for CUPS {cups[-5:]}: {str(e)}") | ||
| return None | ||
| async def get_supply_summary(self, cups: str) -> Dict[str, Any]: | ||
| """Get supply summary attributes for a CUPS. | ||
| Args: | ||
| cups: CUPS identifier | ||
| Returns: | ||
| Dict with supply summary attributes | ||
| """ | ||
| _LOGGER.debug(f"Getting supply summary for CUPS {cups[-5:]}") | ||
| try: | ||
| supply = await self.get_supply_by_cups(cups) | ||
| if not supply: | ||
| _LOGGER.warning(f"No supply found for CUPS {cups[-5:]}") | ||
| return {"cups": None} | ||
| summary = { | ||
| "cups": supply.cups, | ||
| # Add other supply-related summary attributes here as needed | ||
| # These would be used by EdataHelper for calculating summary attributes | ||
| } | ||
| _LOGGER.debug(f"Supply summary calculated for CUPS {cups[-5:]}") | ||
| return summary | ||
| except Exception as e: | ||
| _LOGGER.error( | ||
| f"Error getting supply summary for CUPS {cups[-5:]}: {str(e)}" | ||
| ) | ||
| return {"cups": None} |
| """Datadis connector module testing.""" | ||
| import datetime | ||
| from unittest.mock import AsyncMock, patch | ||
| import pytest | ||
| from edata.connectors.datadis import DatadisConnector | ||
| MOCK_USERNAME = "fake_user" | ||
| MOCK_PASSWORD = "fake_password" | ||
| SUPPLIES_RESPONSE = [ | ||
| { | ||
| "cups": "ESXXXXXXXXXXXXXXXXTEST", | ||
| "validDateFrom": "2022/01/01", | ||
| "validDateTo": "", | ||
| "pointType": 5, | ||
| "distributorCode": "2", | ||
| "address": "fake address, fake 12345", | ||
| "postalCode": "12345", | ||
| "province": "FAKE PROVINCE", | ||
| "municipality": "FAKE MUNICIPALITY", | ||
| "distributor": "FAKE DISTRIBUTOR", | ||
| } | ||
| ] | ||
| SUPPLIES_EXPECTATIONS = [ | ||
| { | ||
| "cups": "ESXXXXXXXXXXXXXXXXTEST", | ||
| "date_start": datetime.datetime(2022, 1, 1, 0, 0), | ||
| "date_end": datetime.datetime.now().replace( | ||
| hour=0, minute=0, second=0, microsecond=0 | ||
| ) | ||
| + datetime.timedelta(days=1), | ||
| "point_type": 5, | ||
| "distributor_code": "2", | ||
| "address": "fake address, fake 12345", | ||
| "postal_code": "12345", | ||
| "province": "FAKE PROVINCE", | ||
| "municipality": "FAKE MUNICIPALITY", | ||
| "distributor": "FAKE DISTRIBUTOR", | ||
| } | ||
| ] | ||
| CONTRACTS_RESPONSE = [ | ||
| { | ||
| "startDate": "2022/10/22", | ||
| "endDate": "2022/10/22", | ||
| "marketer": "fake_marketer", | ||
| "contractedPowerkW": [1.5, 1.5], | ||
| } | ||
| ] | ||
| CONTRACTS_EXPECTATIONS = [ | ||
| { | ||
| "date_start": datetime.datetime(2022, 10, 22, 0, 0), | ||
| "date_end": datetime.datetime(2022, 10, 22, 0, 0), | ||
| "marketer": "fake_marketer", | ||
| "distributor_code": "2", | ||
| "power_p1": 1.5, | ||
| "power_p2": 1.5, | ||
| } | ||
| ] | ||
| CONSUMPTIONS_RESPONSE = [ | ||
| { | ||
| "cups": "ESXXXXXXXXXXXXXXXXTEST", | ||
| "date": "2022/10/22", | ||
| "time": "01:00", | ||
| "consumptionKWh": 1.0, | ||
| "obtainMethod": "Real", | ||
| }, | ||
| { | ||
| "cups": "ESXXXXXXXXXXXXXXXXTEST", | ||
| "date": "2022/10/22", | ||
| "time": "02:00", | ||
| "consumptionKWh": 1.0, | ||
| "obtainMethod": "Real", | ||
| }, | ||
| ] | ||
| CONSUMPTIONS_EXPECTATIONS = [ | ||
| { | ||
| "datetime": datetime.datetime(2022, 10, 22, 0, 0), | ||
| "delta_h": 1, | ||
| "value_kwh": 1.0, | ||
| "surplus_kwh": 0, | ||
| "real": True, | ||
| }, | ||
| { | ||
| "datetime": datetime.datetime(2022, 10, 22, 1, 0), | ||
| "delta_h": 1, | ||
| "value_kwh": 1.0, | ||
| "surplus_kwh": 0, | ||
| "real": True, | ||
| }, | ||
| ] | ||
| MAXIMETER_RESPONSE = [ | ||
| { | ||
| "cups": "ESXXXXXXXXXXXXXXXXTEST", | ||
| "date": "2022/03/01", | ||
| "time": "12:00", | ||
| "maxPower": 1.0, | ||
| } | ||
| ] | ||
| MAXIMETER_EXPECTATIONS = [ | ||
| { | ||
| "datetime": datetime.datetime(2022, 3, 1, 12, 0), | ||
| "value_kw": 1.0, | ||
| } | ||
| ] | ||
| # Tests for async methods (now the only methods available) | ||
| @pytest.mark.asyncio | ||
| @patch.object(DatadisConnector, "_get_token", AsyncMock(return_value=True)) | ||
| @patch.object(DatadisConnector, "_get", AsyncMock(return_value=SUPPLIES_RESPONSE)) | ||
| async def test_get_supplies(): | ||
| """Test a successful 'get_supplies' query.""" | ||
| connector = DatadisConnector(MOCK_USERNAME, MOCK_PASSWORD) | ||
| result = await connector.get_supplies() | ||
| # Note: Now returns Pydantic models instead of dicts | ||
| # Convert to dicts for comparison with expectations | ||
| result_dicts = [supply.model_dump() for supply in result] | ||
| assert result_dicts == SUPPLIES_EXPECTATIONS | ||
| @pytest.mark.asyncio | ||
| @patch.object(DatadisConnector, "_get_token", AsyncMock(return_value=True)) | ||
| @patch.object(DatadisConnector, "_get", AsyncMock(return_value=CONTRACTS_RESPONSE)) | ||
| async def test_get_contract_detail(): | ||
| """Test a successful 'get_contract_detail' query.""" | ||
| connector = DatadisConnector(MOCK_USERNAME, MOCK_PASSWORD) | ||
| result = await connector.get_contract_detail("ESXXXXXXXXXXXXXXXXTEST", "2") | ||
| # Note: Now returns Pydantic models instead of dicts | ||
| result_dicts = [contract.model_dump() for contract in result] | ||
| assert result_dicts == CONTRACTS_EXPECTATIONS | ||
| @pytest.mark.asyncio | ||
| @patch.object(DatadisConnector, "_get_token", AsyncMock(return_value=True)) | ||
| @patch.object(DatadisConnector, "_get", AsyncMock(return_value=CONSUMPTIONS_RESPONSE)) | ||
| async def test_get_consumption_data(): | ||
| """Test a successful 'get_consumption_data' query.""" | ||
| connector = DatadisConnector(MOCK_USERNAME, MOCK_PASSWORD) | ||
| result = await connector.get_consumption_data( | ||
| "ESXXXXXXXXXXXXXXXXTEST", | ||
| "2", | ||
| datetime.datetime(2022, 10, 22, 0, 0, 0), | ||
| datetime.datetime(2022, 10, 22, 2, 0, 0), | ||
| "0", # measurement_type as string | ||
| 5, | ||
| ) | ||
| # Note: Now returns Pydantic models instead of dicts | ||
| result_dicts = [consumption.model_dump() for consumption in result] | ||
| assert result_dicts == CONSUMPTIONS_EXPECTATIONS | ||
| @pytest.mark.asyncio | ||
| @patch.object(DatadisConnector, "_get_token", AsyncMock(return_value=True)) | ||
| @patch.object(DatadisConnector, "_get", AsyncMock(return_value=MAXIMETER_RESPONSE)) | ||
| async def test_get_max_power(): | ||
| """Test a successful 'get_max_power' query.""" | ||
| connector = DatadisConnector(MOCK_USERNAME, MOCK_PASSWORD) | ||
| result = await connector.get_max_power( | ||
| "ESXXXXXXXXXXXXXXXXTEST", | ||
| "2", | ||
| datetime.datetime(2022, 3, 1, 0, 0, 0), | ||
| datetime.datetime(2022, 4, 1, 0, 0, 0), | ||
| ) | ||
| # Note: Now returns Pydantic models instead of dicts | ||
| result_dicts = [maxpower.model_dump() for maxpower in result] | ||
| assert result_dicts == MAXIMETER_EXPECTATIONS |
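These tests rely on patching the connector's low-level coroutines with `AsyncMock` so no network traffic occurs. The same pattern in isolation, with a toy client (all names hypothetical):

```python
import asyncio
from unittest.mock import AsyncMock, patch

class ToyClient:  # stand-in for a real API connector
    async def _get(self, url: str):
        raise RuntimeError("would hit the network")

    async def fetch(self):
        return await self._get("https://example.invalid/data")

async def main():
    # Replace the low-level coroutine so fetch() never touches the network.
    with patch.object(ToyClient, "_get", AsyncMock(return_value={"ok": True})):
        client = ToyClient()
        return await client.fetch()

print(asyncio.run(main()))  # {'ok': True}
```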
| """Tests for REData (online)""" | ||
| from datetime import datetime, timedelta | ||
| import pytest | ||
| from edata.connectors.redata import REDataConnector | ||
| @pytest.mark.asyncio | ||
| async def test_get_realtime_prices(): | ||
| """Test a successful 'get_realtime_prices' query""" | ||
| connector = REDataConnector() | ||
| yesterday = datetime.now().replace(hour=0, minute=0, second=0) - timedelta(days=1) | ||
| response = await connector.get_realtime_prices( | ||
| yesterday, yesterday + timedelta(days=1) - timedelta(minutes=1), False | ||
| ) | ||
| assert len(response) == 24 | ||
| @pytest.mark.asyncio | ||
| async def test_async_get_realtime_prices(): | ||
| """Test a successful 'get_realtime_prices' query (legacy test name)""" | ||
| connector = REDataConnector() | ||
| yesterday = datetime.now().replace(hour=0, minute=0, second=0, microsecond=0) - timedelta(days=1) | ||
| response = await connector.get_realtime_prices( | ||
| yesterday, yesterday + timedelta(days=1) - timedelta(minutes=1), False | ||
| ) | ||
| assert len(response) == 24 |
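The BillingService tests that follow evaluate pricing formula strings (the suite's `test_jinja2_formula_evaluation` indicates a Jinja2-style expression mechanism) against fixture values. As a hedged, pure-Python recomputation of what two of those formula strings encode, using numbers from the `sample_pricing_rules_custom` fixture below (the direct arithmetic is not the library's actual evaluation path, and the consumption/power figures are assumed for illustration):

```python
# Values from the sample_pricing_rules_custom fixture in the tests below.
electricity_tax = 1.05113
iva_tax = 1.21
p1_kwh_eur = 0.15      # custom P1 energy price
kwh = 1.0              # one hour consuming 1 kWh (assumed)

# energy_formula: "electricity_tax * iva_tax * kwh_eur * kwh"
energy_term = electricity_tax * iva_tax * p1_kwh_eur * kwh

p1_kw = p2_kw = 4.0    # contracted power in kW (assumed)
p1_kw_year_eur = 30.67
p2_kw_year_eur = 1.42
market_kw_year_eur = 3.11

# power_formula: yearly €/kW charges prorated down to a single hour
power_term = (
    electricity_tax * iva_tax
    * (p1_kw * (p1_kw_year_eur + market_kw_year_eur) + p2_kw * p2_kw_year_eur)
    / 365 / 24
)

print(round(energy_term, 5), round(power_term, 5))
```

The hourly power term is small because the yearly €/kW rates are divided across all 8760 hours, which matches the `/ 365 / 24` tail of the fixtures' `power_formula`.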
| """Tests for BillingService.""" | ||
| import shutil | ||
| import tempfile | ||
| from datetime import datetime, timedelta | ||
| from unittest.mock import AsyncMock, Mock, patch | ||
| import pytest | ||
| import pytest_asyncio | ||
| from edata.models.pricing import PricingData, PricingRules | ||
| from edata.services.billing import BillingService | ||
| class TestBillingService: | ||
| """Test suite for BillingService.""" | ||
| @pytest.fixture | ||
| def temp_dir(self): | ||
| """Create a temporary directory for tests.""" | ||
| temp_dir = tempfile.mkdtemp() | ||
| yield temp_dir | ||
| shutil.rmtree(temp_dir) | ||
| @pytest.fixture | ||
| def mock_redata_connector(self): | ||
| """Mock REDataConnector for testing.""" | ||
| with patch("edata.services.billing.REDataConnector") as mock_connector_class: | ||
| mock_connector = Mock() | ||
| mock_connector_class.return_value = mock_connector | ||
| yield mock_connector, mock_connector_class | ||
| @pytest.fixture | ||
| def mock_database_service(self): | ||
| """Mock DatabaseService for testing.""" | ||
| with patch("edata.services.billing.get_database_service") as mock_get_db: | ||
| mock_db = Mock() | ||
| # Make the async methods return AsyncMock values | ||
| mock_db.get_pvpc_prices = AsyncMock(return_value=[]) | ||
| mock_db.save_pvpc_price = AsyncMock(return_value=Mock()) | ||
| mock_db.get_billing = AsyncMock(return_value=[]) | ||
| mock_db.save_billing = AsyncMock(return_value=Mock()) | ||
| mock_db.get_consumptions = AsyncMock(return_value=[]) | ||
| mock_db.get_contracts = AsyncMock(return_value=[]) | ||
| mock_db.generate_pricing_config_hash = Mock(return_value="test_hash") | ||
| mock_db.get_latest_pvpc_price = AsyncMock(return_value=None) | ||
| mock_db.get_latest_billing = AsyncMock(return_value=None) | ||
| mock_get_db.return_value = mock_db | ||
| yield mock_db | ||
| @pytest_asyncio.fixture | ||
| async def billing_service( | ||
| self, temp_dir, mock_redata_connector, mock_database_service | ||
| ): | ||
| """Create a BillingService instance for testing.""" | ||
| return BillingService(storage_dir=temp_dir) | ||
| @pytest.fixture | ||
| def sample_pvpc_prices(self): | ||
| """Sample PVPC price data for testing.""" | ||
| return [ | ||
| PricingData( | ||
| datetime=datetime(2024, 6, 17, 10, 0), | ||
| value_eur_kwh=0.12345, | ||
| delta_h=1.0, | ||
| ), | ||
| PricingData( | ||
| datetime=datetime(2024, 6, 17, 11, 0), | ||
| value_eur_kwh=0.13456, | ||
| delta_h=1.0, | ||
| ), | ||
| PricingData( | ||
| datetime=datetime(2024, 6, 17, 12, 0), | ||
| value_eur_kwh=0.14567, | ||
| delta_h=1.0, | ||
| ), | ||
| ] | ||
| @pytest.fixture | ||
| def sample_pricing_rules_pvpc(self): | ||
| """Sample pricing rules for PVPC configuration.""" | ||
| return PricingRules( | ||
| p1_kw_year_eur=30.67, | ||
| p2_kw_year_eur=1.42, | ||
| p1_kwh_eur=None, # PVPC | ||
| p2_kwh_eur=None, # PVPC | ||
| p3_kwh_eur=None, # PVPC | ||
| surplus_p1_kwh_eur=0.05, | ||
| surplus_p2_kwh_eur=0.04, | ||
| surplus_p3_kwh_eur=0.03, | ||
| meter_month_eur=0.81, | ||
| market_kw_year_eur=3.11, | ||
| electricity_tax=1.05113, | ||
| iva_tax=1.21, | ||
| energy_formula="electricity_tax * iva_tax * kwh_eur * kwh", | ||
| power_formula="electricity_tax * iva_tax * (p1_kw * (p1_kw_year_eur + market_kw_year_eur) + p2_kw * p2_kw_year_eur) / 365 / 24", | ||
| others_formula="iva_tax * meter_month_eur / 30 / 24", | ||
| surplus_formula="electricity_tax * iva_tax * surplus_kwh * surplus_kwh_eur", | ||
| main_formula="energy_term + power_term + others_term", | ||
| ) | ||
| @pytest.fixture | ||
| def sample_pricing_rules_custom(self): | ||
| """Sample pricing rules for custom pricing configuration.""" | ||
| return PricingRules( | ||
| p1_kw_year_eur=30.67, | ||
| p2_kw_year_eur=1.42, | ||
| p1_kwh_eur=0.15, # Custom prices | ||
| p2_kwh_eur=0.12, | ||
| p3_kwh_eur=0.08, | ||
| surplus_p1_kwh_eur=0.05, | ||
| surplus_p2_kwh_eur=0.04, | ||
| surplus_p3_kwh_eur=0.03, | ||
| meter_month_eur=0.81, | ||
| market_kw_year_eur=3.11, | ||
| electricity_tax=1.05113, | ||
| iva_tax=1.21, | ||
| energy_formula="electricity_tax * iva_tax * kwh_eur * kwh", | ||
| power_formula="electricity_tax * iva_tax * (p1_kw * (p1_kw_year_eur + market_kw_year_eur) + p2_kw * p2_kw_year_eur) / 365 / 24", | ||
| others_formula="iva_tax * meter_month_eur / 30 / 24", | ||
| surplus_formula="electricity_tax * iva_tax * surplus_kwh * surplus_kwh_eur", | ||
| main_formula="energy_term + power_term + others_term", | ||
| ) | ||
| @pytest.mark.asyncio | ||
| async def test_initialization( | ||
| self, temp_dir, mock_redata_connector, mock_database_service | ||
| ): | ||
| """Test BillingService initialization.""" | ||
| mock_connector, mock_connector_class = mock_redata_connector | ||
| service = BillingService(storage_dir=temp_dir) | ||
| # Verify REDataConnector was initialized | ||
| mock_connector_class.assert_called_once() | ||
| # Verify database service is obtained lazily by calling _get_db_service | ||
| db_service = await service._get_db_service() | ||
| assert db_service is mock_database_service | ||
| @pytest.mark.asyncio | ||
| async def test_update_pvpc_prices_success( | ||
| self, | ||
| billing_service, | ||
| mock_redata_connector, | ||
| mock_database_service, | ||
| sample_pvpc_prices, | ||
| ): | ||
| """Test successful PVPC price update.""" | ||
| mock_connector, mock_connector_class = mock_redata_connector | ||
| start_date = datetime(2024, 6, 17, 0, 0) | ||
| end_date = datetime(2024, 6, 17, 23, 59) | ||
| # Mock REData connector response | ||
| mock_connector.get_realtime_prices = AsyncMock(return_value=sample_pvpc_prices) | ||
| # Mock database service responses - no existing prices | ||
| mock_database_service.get_pvpc_prices.return_value = [] | ||
| # Execute PVPC update | ||
| result = await billing_service.update_pvpc_prices( | ||
| start_date=start_date, end_date=end_date, is_ceuta_melilla=False | ||
| ) | ||
| # Verify REData connector was called correctly | ||
| mock_connector.get_realtime_prices.assert_called_once_with( | ||
| dt_from=start_date, dt_to=end_date, is_ceuta_melilla=False | ||
| ) | ||
| # Verify database service was called for each price | ||
| assert mock_database_service.save_pvpc_price.call_count == len( | ||
| sample_pvpc_prices | ||
| ) | ||
| # Verify result structure | ||
| assert result["success"] is True | ||
| assert result["region"] == "Peninsula" | ||
| assert result["geo_id"] == 8741 | ||
| assert result["stats"]["fetched"] == len(sample_pvpc_prices) | ||
| assert result["stats"]["saved"] == len(sample_pvpc_prices) | ||
| assert result["stats"]["updated"] == 0 | ||
| @patch("edata.utils.get_pvpc_tariff") | ||
| def test_get_custom_prices_success( | ||
| self, | ||
| mock_get_pvpc_tariff, | ||
| billing_service, | ||
| mock_redata_connector, | ||
| mock_database_service, | ||
| sample_pricing_rules_custom, | ||
| ): | ||
| """Test successful custom price calculation.""" | ||
| mock_connector, mock_connector_class = mock_redata_connector | ||
| start_date = datetime(2024, 6, 17, 10, 0) # Monday 10 AM | ||
| end_date = datetime(2024, 6, 17, 13, 0) # Monday 1 PM (3 hours) | ||
| # Mock tariff calculation: every hour resolves to period P1 | ||
| mock_get_pvpc_tariff.side_effect = ["p1", "p1", "p1"] # All P1 hours | ||
| # Execute custom price calculation (not async) | ||
| result = billing_service.get_custom_prices( | ||
| pricing_rules=sample_pricing_rules_custom, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Verify tariff function was called for each hour | ||
| assert mock_get_pvpc_tariff.call_count == 3 | ||
| # Verify result structure | ||
| assert len(result) == 3 | ||
| assert all(isinstance(price, PricingData) for price in result) | ||
| assert all( | ||
| price.value_eur_kwh == sample_pricing_rules_custom.p1_kwh_eur | ||
| for price in result | ||
| ) | ||
| @pytest.mark.asyncio | ||
| async def test_get_stored_pvpc_prices( | ||
| self, billing_service, mock_redata_connector, mock_database_service | ||
| ): | ||
| """Test getting stored PVPC prices from database.""" | ||
| start_date = datetime(2024, 6, 17, 0, 0) | ||
| end_date = datetime(2024, 6, 17, 23, 59) | ||
| geo_id = 8741 | ||
| # Mock database service response | ||
| mock_prices = [Mock(), Mock(), Mock()] | ||
| mock_database_service.get_pvpc_prices.return_value = mock_prices | ||
| # Execute get stored prices | ||
| result = await billing_service.get_stored_pvpc_prices( | ||
| start_date=start_date, end_date=end_date, geo_id=geo_id | ||
| ) | ||
| # Verify database service was called correctly | ||
| mock_database_service.get_pvpc_prices.assert_called_once_with( | ||
| start_date, end_date, geo_id | ||
| ) | ||
| # Verify result | ||
| assert result == mock_prices | ||
| @pytest.mark.asyncio | ||
| async def test_get_prices_pvpc( | ||
| self, | ||
| billing_service, | ||
| mock_redata_connector, | ||
| mock_database_service, | ||
| sample_pricing_rules_pvpc, | ||
| ): | ||
| """Test automatic price retrieval with PVPC configuration.""" | ||
| mock_connector, mock_connector_class = mock_redata_connector | ||
| start_date = datetime(2024, 6, 17, 0, 0) | ||
| end_date = datetime(2024, 6, 17, 23, 59) | ||
| # Mock stored PVPC prices | ||
| mock_pvpc_prices = [ | ||
| Mock( | ||
| datetime=datetime(2024, 6, 17, 10, 0), value_eur_kwh=0.15, delta_h=1.0 | ||
| ), | ||
| Mock( | ||
| datetime=datetime(2024, 6, 17, 11, 0), value_eur_kwh=0.16, delta_h=1.0 | ||
| ), | ||
| ] | ||
| mock_database_service.get_pvpc_prices.return_value = mock_pvpc_prices | ||
| # Execute automatic price retrieval with PVPC rules | ||
| result = await billing_service.get_prices( | ||
| pricing_rules=sample_pricing_rules_pvpc, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| is_ceuta_melilla=False, | ||
| ) | ||
| # Should call PVPC retrieval | ||
| mock_database_service.get_pvpc_prices.assert_called_once() | ||
| assert len(result) == 2 | ||
| assert all(isinstance(price, PricingData) for price in result) | ||
| @patch("edata.utils.get_pvpc_tariff") | ||
| @pytest.mark.asyncio | ||
| async def test_get_prices_custom( | ||
| self, | ||
| mock_get_pvpc_tariff, | ||
| billing_service, | ||
| mock_redata_connector, | ||
| mock_database_service, | ||
| sample_pricing_rules_custom, | ||
| ): | ||
| """Test automatic price retrieval with custom configuration.""" | ||
| mock_connector, mock_connector_class = mock_redata_connector | ||
| start_date = datetime(2024, 6, 17, 10, 0) | ||
| end_date = datetime(2024, 6, 17, 11, 0) | ||
| # Mock tariff calculation | ||
| mock_get_pvpc_tariff.return_value = "p1" | ||
| # Execute automatic price retrieval with custom rules | ||
| result = await billing_service.get_prices( | ||
| pricing_rules=sample_pricing_rules_custom, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Should call custom calculation (not database) | ||
| mock_database_service.get_pvpc_prices.assert_not_called() | ||
| assert len(result) == 1 | ||
| assert isinstance(result[0], PricingData) | ||
| assert result[0].value_eur_kwh == sample_pricing_rules_custom.p1_kwh_eur | ||
| @pytest.mark.asyncio | ||
| async def test_get_prices_pvpc_no_data( | ||
| self, | ||
| billing_service, | ||
| mock_redata_connector, | ||
| mock_database_service, | ||
| sample_pricing_rules_pvpc, | ||
| ): | ||
| """Test automatic price retrieval with PVPC configuration but no data.""" | ||
| mock_connector, mock_connector_class = mock_redata_connector | ||
| start_date = datetime(2024, 6, 17, 0, 0) | ||
| end_date = datetime(2024, 6, 17, 23, 59) | ||
| # Mock no PVPC prices available | ||
| mock_database_service.get_pvpc_prices.return_value = [] | ||
| # Execute automatic price retrieval with PVPC rules | ||
| result = await billing_service.get_prices( | ||
| pricing_rules=sample_pricing_rules_pvpc, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| is_ceuta_melilla=False, | ||
| ) | ||
| # Should return None when no data available | ||
| assert result is None | ||
| mock_database_service.get_pvpc_prices.assert_called_once() | ||
| @pytest.mark.asyncio | ||
| async def test_get_prices_custom_no_prices_defined( | ||
| self, billing_service, mock_redata_connector, mock_database_service | ||
| ): | ||
| """Test automatic price retrieval with custom configuration but no prices defined.""" | ||
| from edata.models.pricing import PricingRules | ||
| mock_connector, mock_connector_class = mock_redata_connector | ||
| start_date = datetime(2024, 6, 17, 10, 0) | ||
| end_date = datetime(2024, 6, 17, 11, 0) | ||
| # Create pricing rules with no energy prices defined | ||
| empty_pricing_rules = PricingRules( | ||
| p1_kw_year_eur=30.67, | ||
| p2_kw_year_eur=1.42, | ||
| p1_kwh_eur=None, # No custom prices | ||
| p2_kwh_eur=None, | ||
| p3_kwh_eur=None, | ||
| surplus_p1_kwh_eur=0.05, | ||
| surplus_p2_kwh_eur=0.04, | ||
| surplus_p3_kwh_eur=0.03, | ||
| meter_month_eur=0.81, | ||
| market_kw_year_eur=3.11, | ||
| electricity_tax=1.05113, | ||
| iva_tax=1.21, | ||
| energy_formula="electricity_tax * iva_tax * kwh_eur * kwh", | ||
| power_formula="electricity_tax * iva_tax * (p1_kw * (p1_kw_year_eur + market_kw_year_eur) + p2_kw * p2_kw_year_eur) / 365 / 24", | ||
| others_formula="iva_tax * meter_month_eur / 30 / 24", | ||
| surplus_formula="electricity_tax * iva_tax * surplus_kwh * surplus_kwh_eur", | ||
| main_formula="energy_term + power_term + others_term", | ||
| ) | ||
| # Mock empty PVPC prices since rules indicate PVPC usage | ||
| mock_database_service.get_pvpc_prices.return_value = [] | ||
| # Execute automatic price retrieval with empty custom rules | ||
| result = await billing_service.get_prices( | ||
| pricing_rules=empty_pricing_rules, start_date=start_date, end_date=end_date | ||
| ) | ||
| # Should return None when no PVPC prices available | ||
| assert result is None | ||
| # Should have tried to get PVPC prices since no custom prices are defined | ||
| mock_database_service.get_pvpc_prices.assert_called_once_with( | ||
| start_date, end_date, 8741 | ||
| ) | ||
| @pytest.mark.asyncio | ||
| @patch("edata.utils.get_pvpc_tariff") | ||
| async def test_get_cost_calculation( | ||
| self, | ||
| mock_get_pvpc_tariff, | ||
| billing_service, | ||
| mock_redata_connector, | ||
| mock_database_service, | ||
| sample_pricing_rules_custom, | ||
| ): | ||
| """Test cost calculation functionality.""" | ||
| from datetime import datetime | ||
| from edata.models.pricing import PricingAggregated | ||
| mock_connector, mock_connector_class = mock_redata_connector | ||
| cups = "ES0123456789012345AB" | ||
| start_date = datetime(2024, 6, 17, 10, 0) | ||
| end_date = datetime(2024, 6, 17, 12, 0) # 2 hours | ||
| # Mock no existing billing data initially | ||
| mock_database_service.get_billing.return_value = [] | ||
| # Mock the pricing config hash generation | ||
| mock_database_service.generate_pricing_config_hash.return_value = ( | ||
| "test_hash_12345678" | ||
| ) | ||
| # Mock consumption data | ||
| mock_consumptions = [ | ||
| type( | ||
| "MockConsumption", | ||
| (), | ||
| { | ||
| "datetime": datetime(2024, 6, 17, 10, 0), | ||
| "value_kwh": 0.5, | ||
| "surplus_kwh": 0.0, | ||
| }, | ||
| )(), | ||
| type( | ||
| "MockConsumption", | ||
| (), | ||
| { | ||
| "datetime": datetime(2024, 6, 17, 11, 0), | ||
| "value_kwh": 0.6, | ||
| "surplus_kwh": 0.0, | ||
| }, | ||
| )(), | ||
| ] | ||
| mock_database_service.get_consumptions.return_value = mock_consumptions | ||
| # Mock contract data | ||
| mock_contracts = [ | ||
| type( | ||
| "MockContract", | ||
| (), | ||
| { | ||
| "power_p1": 4.0, | ||
| "power_p2": 4.0, | ||
| "date_start": datetime(2024, 6, 17, 0, 0), | ||
| "date_end": datetime(2024, 6, 18, 0, 0), | ||
| }, | ||
| )() | ||
| ] | ||
| mock_database_service.get_contracts.return_value = mock_contracts | ||
| # Mock the save_billing method to return a success response | ||
| mock_database_service.save_billing.return_value = type("MockBilling", (), {})() | ||
| # Mock billing data after calculation | ||
| mock_billing_results = [ | ||
| type( | ||
| "MockBilling", | ||
| (), | ||
| { | ||
| "datetime": datetime(2024, 6, 17, 10, 0), | ||
| "total_eur": 0.05, | ||
| "energy_term": 0.03, | ||
| "power_term": 0.015, | ||
| "others_term": 0.005, | ||
| "surplus_term": 0.0, | ||
| }, | ||
| )(), | ||
| type( | ||
| "MockBilling", | ||
| (), | ||
| { | ||
| "datetime": datetime(2024, 6, 17, 11, 0), | ||
| "total_eur": 0.06, | ||
| "energy_term": 0.036, | ||
| "power_term": 0.015, | ||
| "others_term": 0.005, | ||
| "surplus_term": 0.0, | ||
| }, | ||
| )(), | ||
| ] | ||
| # Configure get_billing to return empty first, then billing results after update_missing_costs | ||
| mock_database_service.get_billing.side_effect = [ | ||
| [], | ||
| mock_billing_results, | ||
| mock_billing_results, | ||
| ] | ||
| # Mock tariff calculation - need one call per hour in the data | ||
| mock_get_pvpc_tariff.return_value = "p1" # same period every call, so return_value instead of side_effect | ||
| # Execute cost calculation | ||
| result = await billing_service.get_cost( | ||
| cups=cups, | ||
| pricing_rules=sample_pricing_rules_custom, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Validate result aggregation from mocked billing data | ||
| assert isinstance(result, PricingAggregated) | ||
| assert result.datetime == start_date | ||
| assert result.value_eur == 0.11 # 0.05 + 0.06 | ||
| assert result.energy_term == 0.066 # 0.03 + 0.036 | ||
| assert result.power_term == 0.03 # 0.015 + 0.015 | ||
| assert result.others_term == 0.01 # 0.005 + 0.005 | ||
| assert result.surplus_term == 0.0 | ||
| assert result.delta_h == 2 # 2 billing records | ||
| # Verify database calls | ||
| mock_database_service.get_consumptions.assert_called_once_with( | ||
| cups, start_date, end_date | ||
| ) | ||
| mock_database_service.get_contracts.assert_called_once_with(cups) | ||
| @pytest.mark.asyncio | ||
| async def test_get_cost_no_consumption_data( | ||
| self, | ||
| billing_service, | ||
| mock_redata_connector, | ||
| mock_database_service, | ||
| sample_pricing_rules_custom, | ||
| ): | ||
| """Test cost calculation with no consumption data.""" | ||
| from datetime import datetime | ||
| from edata.models.pricing import PricingAggregated | ||
| mock_connector, mock_connector_class = mock_redata_connector | ||
| cups = "ES0123456789012345AB" | ||
| start_date = datetime(2024, 6, 17, 10, 0) | ||
| end_date = datetime(2024, 6, 17, 12, 0) | ||
| # Mock no existing billing data initially | ||
| mock_database_service.get_billing.return_value = [] | ||
| # Mock the pricing config hash generation | ||
| mock_database_service.generate_pricing_config_hash.return_value = ( | ||
| "test_hash_12345678" | ||
| ) | ||
| # Mock no consumption data | ||
| mock_database_service.get_consumptions.return_value = [] | ||
| # Execute cost calculation | ||
| result = await billing_service.get_cost( | ||
| cups=cups, | ||
| pricing_rules=sample_pricing_rules_custom, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Should return default values when update_missing_costs fails | ||
| assert isinstance(result, PricingAggregated) | ||
| assert result.value_eur == 0.0 | ||
| assert result.energy_term == 0.0 | ||
| assert result.power_term == 0.0 | ||
| assert result.others_term == 0.0 | ||
| assert result.surplus_term == 0.0 | ||
| assert result.delta_h == 2.0 # (end_date - start_date).total_seconds() / 3600 | ||
| @pytest.mark.asyncio | ||
| async def test_get_cost_no_pricing_data( | ||
| self, billing_service, mock_redata_connector, mock_database_service | ||
| ): | ||
| """Test cost calculation with no pricing data available.""" | ||
| from datetime import datetime | ||
| from edata.models.pricing import PricingAggregated, PricingRules | ||
| mock_connector, mock_connector_class = mock_redata_connector | ||
| cups = "ES0123456789012345AB" | ||
| start_date = datetime(2024, 6, 17, 10, 0) | ||
| end_date = datetime(2024, 6, 17, 12, 0) | ||
| # Mock no existing billing data initially | ||
| mock_database_service.get_billing.return_value = [] | ||
| # Mock the pricing config hash generation | ||
| mock_database_service.generate_pricing_config_hash.return_value = ( | ||
| "test_hash_12345678" | ||
| ) | ||
| # Mock consumption data present | ||
| mock_consumptions = [ | ||
| type( | ||
| "MockConsumption", | ||
| (), | ||
| { | ||
| "datetime": datetime(2024, 6, 17, 10, 0), | ||
| "value_kwh": 0.5, | ||
| "surplus_kwh": 0.0, | ||
| }, | ||
| )() | ||
| ] | ||
| mock_database_service.get_consumptions.return_value = mock_consumptions | ||
| # Mock contract data present | ||
| mock_contracts = [ | ||
| type( | ||
| "MockContract", | ||
| (), | ||
| { | ||
| "power_p1": 4.0, | ||
| "power_p2": 4.0, | ||
| "date_start": datetime(2024, 6, 17, 0, 0), | ||
| "date_end": datetime(2024, 6, 18, 0, 0), | ||
| }, | ||
| )() | ||
| ] | ||
| mock_database_service.get_contracts.return_value = mock_contracts | ||
| # Mock no PVPC prices available | ||
| mock_database_service.get_pvpc_prices.return_value = [] | ||
| # Create PVPC pricing rules | ||
| pvpc_pricing_rules = PricingRules( | ||
| p1_kw_year_eur=30.67, | ||
| p2_kw_year_eur=1.42, | ||
| p1_kwh_eur=None, # PVPC | ||
| p2_kwh_eur=None, | ||
| p3_kwh_eur=None, | ||
| surplus_p1_kwh_eur=0.05, | ||
| surplus_p2_kwh_eur=0.04, | ||
| surplus_p3_kwh_eur=0.03, | ||
| meter_month_eur=0.81, | ||
| market_kw_year_eur=3.11, | ||
| electricity_tax=1.05113, | ||
| iva_tax=1.21, | ||
| energy_formula="electricity_tax * iva_tax * kwh_eur * kwh", | ||
| power_formula="electricity_tax * iva_tax * (p1_kw * (p1_kw_year_eur + market_kw_year_eur) + p2_kw * p2_kw_year_eur) / 365 / 24", | ||
| others_formula="iva_tax * meter_month_eur / 30 / 24", | ||
| surplus_formula="electricity_tax * iva_tax * surplus_kwh * surplus_kwh_eur", | ||
| main_formula="energy_term + power_term + others_term", | ||
| ) | ||
| # Execute cost calculation | ||
| result = await billing_service.get_cost( | ||
| cups=cups, | ||
| pricing_rules=pvpc_pricing_rules, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Verify result when no pricing data available | ||
| assert isinstance(result, PricingAggregated) | ||
| assert result.datetime == start_date | ||
| assert result.value_eur == 0.0 | ||
| assert result.energy_term == 0.0 | ||
| assert result.power_term == 0.0 | ||
| assert result.others_term == 0.0 | ||
| assert result.surplus_term == 0.0 | ||
| @pytest.mark.asyncio | ||
| async def test_jinja2_formula_evaluation( | ||
| self, billing_service, mock_redata_connector, mock_database_service | ||
| ): | ||
| """Test Jinja2 formula evaluation with predictable values.""" | ||
| from datetime import datetime | ||
| from edata.models.pricing import PricingAggregated, PricingRules | ||
| mock_connector, mock_connector_class = mock_redata_connector | ||
| cups = "ES0123456789012345AB" | ||
| start_date = datetime(2024, 6, 17, 10, 0) # P1 period (Monday 10:00) | ||
| end_date = datetime(2024, 6, 17, 11, 0) # 1 hour | ||
| # Mock no existing billing data initially | ||
| mock_database_service.get_billing.return_value = [] | ||
| # Mock the pricing config hash generation | ||
| mock_database_service.generate_pricing_config_hash.return_value = ( | ||
| "test_hash_12345678" | ||
| ) | ||
| # Mock predictable consumption data: 1 kWh consumed, 0.5 kWh surplus | ||
| mock_consumptions = [ | ||
| type( | ||
| "MockConsumption", | ||
| (), | ||
| { | ||
| "datetime": datetime(2024, 6, 17, 10, 0), | ||
| "value_kwh": 1.0, | ||
| "surplus_kwh": 0.5, | ||
| }, | ||
| )() | ||
| ] | ||
| mock_database_service.get_consumptions.return_value = mock_consumptions | ||
| # Mock predictable contract data: 5kW P1, 3kW P2 | ||
| mock_contracts = [ | ||
| type( | ||
| "MockContract", | ||
| (), | ||
| { | ||
| "power_p1": 5.0, | ||
| "power_p2": 3.0, | ||
| "date_start": datetime(2024, 6, 17, 0, 0), | ||
| "date_end": datetime(2024, 6, 18, 0, 0), | ||
| }, | ||
| )() | ||
| ] | ||
| mock_database_service.get_contracts.return_value = mock_contracts | ||
| # Mock predictable PVPC prices: 0.10 €/kWh | ||
| mock_pvpc_prices = [ | ||
| type( | ||
| "MockPVPCPrice", | ||
| (), | ||
| { | ||
| "datetime": datetime(2024, 6, 17, 10, 0), | ||
| "value_eur_kwh": 0.10, | ||
| "delta_h": 1.0, | ||
| }, | ||
| )() | ||
| ] | ||
| mock_database_service.get_pvpc_prices.return_value = mock_pvpc_prices | ||
| # Mock the save_billing method to return a success response | ||
| mock_database_service.save_billing.return_value = type("MockBilling", (), {})() | ||
| # Mock billing result after calculation (with predictable values) | ||
| # Energy term: 1.05 * 1.21 * 0.10 * 1.0 = 0.12705 | ||
| expected_energy_term = 1.05 * 1.21 * 0.10 * 1.0 | ||
| # Power term: 1.05 * 1.21 * (5 * (40 + 4) + 3 * 20) / 365 / 24 | ||
| expected_power_term = 1.05 * 1.21 * (5 * (40 + 4) + 3 * 20) / 365 / 24 | ||
| # Others term: 1.21 * 3.0 / 30 / 24 | ||
| expected_others_term = 1.21 * 3.0 / 30 / 24 | ||
| # Surplus term: 1.05 * 1.21 * 0.5 * 0.06 | ||
| expected_surplus_term = 1.05 * 1.21 * 0.5 * 0.06 | ||
| # Total: energy + power + others - surplus | ||
| expected_total = ( | ||
| expected_energy_term | ||
| + expected_power_term | ||
| + expected_others_term | ||
| - expected_surplus_term | ||
| ) | ||
| mock_billing_result = [ | ||
| type( | ||
| "MockBilling", | ||
| (), | ||
| { | ||
| "datetime": datetime(2024, 6, 17, 10, 0), | ||
| "total_eur": expected_total, | ||
| "energy_term": expected_energy_term, | ||
| "power_term": expected_power_term, | ||
| "others_term": expected_others_term, | ||
| "surplus_term": expected_surplus_term, | ||
| }, | ||
| )() | ||
| ] | ||
| # Configure get_billing to return empty first, then billing results after update_missing_costs | ||
| mock_database_service.get_billing.side_effect = [ | ||
| [], | ||
| mock_billing_result, | ||
| mock_billing_result, | ||
| ] | ||
| # Create PVPC pricing rules with simplified formulas for testing | ||
| test_pricing_rules = PricingRules( | ||
| p1_kw_year_eur=40.0, # €40/kW/year | ||
| p2_kw_year_eur=20.0, # €20/kW/year | ||
| p1_kwh_eur=None, # Use PVPC (0.10 €/kWh) | ||
| p2_kwh_eur=None, | ||
| p3_kwh_eur=None, | ||
| surplus_p1_kwh_eur=0.06, # €0.06/kWh surplus in P1 | ||
| surplus_p2_kwh_eur=0.04, # €0.04/kWh surplus in P2 | ||
| surplus_p3_kwh_eur=0.02, # €0.02/kWh surplus in P3 | ||
| meter_month_eur=3.0, # €3/month meter | ||
| market_kw_year_eur=4.0, # €4/kW/year market | ||
| electricity_tax=1.05, # 5% electricity tax | ||
| iva_tax=1.21, # 21% IVA | ||
| # Simplified formulas for predictable calculation | ||
| energy_formula="electricity_tax * iva_tax * kwh_eur * kwh", | ||
| power_formula="electricity_tax * iva_tax * (p1_kw * (p1_kw_year_eur + market_kw_year_eur) + p2_kw * p2_kw_year_eur) / 365 / 24", | ||
| others_formula="iva_tax * meter_month_eur / 30 / 24", | ||
| surplus_formula="electricity_tax * iva_tax * surplus_kwh * surplus_kwh_eur", | ||
| main_formula="energy_term + power_term + others_term - surplus_term", | ||
| ) | ||
| # Execute cost calculation | ||
| result = await billing_service.get_cost( | ||
| cups=cups, | ||
| pricing_rules=test_pricing_rules, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Verify the result matches our mocked billing data | ||
| assert isinstance(result, PricingAggregated) | ||
| assert result.datetime == start_date | ||
| assert result.delta_h == 1 # 1 billing record | ||
| # Verify that the aggregated values match our expected calculations | ||
| assert round(result.energy_term, 4) == round(expected_energy_term, 4) | ||
| assert round(result.power_term, 4) == round(expected_power_term, 4) | ||
| assert round(result.others_term, 4) == round(expected_others_term, 4) | ||
| assert round(result.surplus_term, 4) == round(expected_surplus_term, 4) | ||
| assert round(result.value_eur, 4) == round(expected_total, 4) | ||
| @pytest.mark.asyncio | ||
| async def test_update_missing_costs( | ||
| self, billing_service, mock_redata_connector, mock_database_service | ||
| ): | ||
| """Test update_missing_costs method.""" | ||
| from datetime import datetime | ||
| from edata.models.pricing import PricingRules | ||
| mock_connector, mock_connector_class = mock_redata_connector | ||
| cups = "ES0123456789012345AB" | ||
| start_date = datetime(2024, 6, 17, 10, 0) | ||
| end_date = datetime(2024, 6, 17, 12, 0) | ||
| # Mock consumption data | ||
| mock_consumptions = [ | ||
| type( | ||
| "MockConsumption", | ||
| (), | ||
| { | ||
| "datetime": datetime(2024, 6, 17, 10, 0), | ||
| "value_kwh": 0.5, | ||
| "surplus_kwh": 0.0, | ||
| }, | ||
| )(), | ||
| type( | ||
| "MockConsumption", | ||
| (), | ||
| { | ||
| "datetime": datetime(2024, 6, 17, 11, 0), | ||
| "value_kwh": 0.7, | ||
| "surplus_kwh": 0.1, | ||
| }, | ||
| )(), | ||
| ] | ||
| mock_database_service.get_consumptions.return_value = mock_consumptions | ||
| # Mock contract data | ||
| mock_contracts = [ | ||
| type( | ||
| "MockContract", | ||
| (), | ||
| { | ||
| "power_p1": 4.0, | ||
| "power_p2": 4.0, | ||
| "date_start": datetime(2024, 6, 17, 0, 0), | ||
| "date_end": datetime(2024, 6, 18, 0, 0), | ||
| }, | ||
| )() | ||
| ] | ||
| mock_database_service.get_contracts.return_value = mock_contracts | ||
| # Mock no existing billing records | ||
| mock_database_service.get_billing.return_value = [] | ||
| # Mock successful billing save | ||
| mock_billing_record = type("MockBilling", (), {"id": 1})() | ||
| mock_database_service.save_billing.return_value = mock_billing_record | ||
| # Mock hash generation | ||
| mock_database_service.generate_pricing_config_hash.return_value = ( | ||
| "test_hash_123" | ||
| ) | ||
| # Create custom pricing rules (no PVPC) | ||
| custom_pricing_rules = PricingRules( | ||
| p1_kw_year_eur=30.0, | ||
| p2_kw_year_eur=20.0, | ||
| p1_kwh_eur=0.15, # Custom prices - no PVPC | ||
| p2_kwh_eur=0.12, | ||
| p3_kwh_eur=0.10, | ||
| surplus_p1_kwh_eur=0.06, | ||
| surplus_p2_kwh_eur=0.04, | ||
| surplus_p3_kwh_eur=0.02, | ||
| meter_month_eur=3.0, | ||
| market_kw_year_eur=4.0, | ||
| electricity_tax=1.05, | ||
| iva_tax=1.21, | ||
| energy_formula="electricity_tax * iva_tax * kwh_eur * kwh", | ||
| power_formula="electricity_tax * iva_tax * (p1_kw * (p1_kw_year_eur + market_kw_year_eur) + p2_kw * p2_kw_year_eur) / 365 / 24", | ||
| others_formula="iva_tax * meter_month_eur / 30 / 24", | ||
| surplus_formula="electricity_tax * iva_tax * surplus_kwh * surplus_kwh_eur", | ||
| main_formula="energy_term + power_term + others_term - surplus_term", | ||
| ) | ||
| # Execute update_missing_costs | ||
| result = await billing_service.update_missing_costs( | ||
| cups=cups, | ||
| pricing_rules=custom_pricing_rules, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Verify successful result | ||
| assert result["success"] is True | ||
| assert result["cups"] == cups | ||
| assert result["pricing_config_hash"] == "test_hash_123" | ||
| # Verify statistics | ||
| stats = result["stats"] | ||
| assert stats["total_consumptions"] == 2 | ||
| assert stats["processed"] > 0 # Should have processed some records | ||
| # Verify database methods were called | ||
| mock_database_service.get_consumptions.assert_called_once_with( | ||
| cups, start_date, end_date | ||
| ) | ||
| mock_database_service.get_contracts.assert_called_once_with(cups) | ||
| mock_database_service.get_billing.assert_called_once() | ||
| mock_database_service.generate_pricing_config_hash.assert_called_once() | ||
| # Verify save_billing was called (at least once) | ||
| assert mock_database_service.save_billing.call_count > 0 | ||
| @pytest.mark.asyncio | ||
| async def test_get_daily_costs_with_existing_data( | ||
| self, billing_service, mock_database_service, sample_pricing_rules_custom | ||
| ): | ||
| """Test get_daily_costs with existing billing data.""" | ||
| from edata.models.database import BillingModel | ||
| # Create mock billing records for 2 days | ||
| base_date = datetime(2024, 1, 1, 0, 0, 0) | ||
| mock_billing_records = [] | ||
| # Create 48 hours of billing data (2 days) | ||
| for hour in range(48): | ||
| record = BillingModel( | ||
| datetime=base_date + timedelta(hours=hour), | ||
| cups="ES0012345678901234567890AB", | ||
| pricing_config_hash="test_hash", | ||
| total_eur=1.5 + (hour * 0.1), # Varying costs | ||
| energy_term=1.0 + (hour * 0.05), | ||
| power_term=0.3, | ||
| others_term=0.1, | ||
| surplus_term=0.1 + (hour * 0.05), | ||
| ) | ||
| mock_billing_records.append(record) | ||
| # Setup mocks | ||
| mock_database_service.generate_pricing_config_hash.return_value = "test_hash" | ||
| mock_database_service.get_billing.return_value = mock_billing_records | ||
| # Test parameters | ||
| cups = "ES0012345678901234567890AB" | ||
| start_date = datetime(2024, 1, 1) | ||
| end_date = datetime(2024, 1, 2, 23, 59, 59) | ||
| # Call method | ||
| result = await billing_service.get_daily_costs( | ||
| cups, sample_pricing_rules_custom, start_date, end_date | ||
| ) | ||
| # Assertions | ||
| assert len(result) == 2 # Two days | ||
| from edata.models.pricing import PricingAggregated | ||
| assert all(isinstance(item, PricingAggregated) for item in result) | ||
| # Check first day | ||
| first_day = result[0] | ||
| assert first_day.datetime.date() == datetime(2024, 1, 1).date() | ||
| assert first_day.delta_h == 24 # 24 hours | ||
| assert first_day.value_eur > 0 | ||
| # Check second day | ||
| second_day = result[1] | ||
| assert second_day.datetime.date() == datetime(2024, 1, 2).date() | ||
| assert second_day.delta_h == 24 # 24 hours | ||
| assert ( | ||
| second_day.value_eur > first_day.value_eur | ||
| ) # Should be higher due to increasing pattern | ||
| @pytest.mark.asyncio | ||
| async def test_get_daily_costs_without_existing_data( | ||
| self, billing_service, mock_database_service, sample_pricing_rules_custom | ||
| ): | ||
| """Test get_daily_costs when no billing data exists.""" | ||
| # Setup mocks - no existing data | ||
| mock_database_service.generate_pricing_config_hash.return_value = "test_hash" | ||
| mock_database_service.get_billing.side_effect = [ | ||
| [], | ||
| [], | ||
| ] # First call empty, second still empty | ||
| # Mock update_missing_costs to fail | ||
| with patch.object(billing_service, "update_missing_costs") as mock_update: | ||
| mock_update.return_value = {"success": False, "error": "Test error"} | ||
| # Test parameters | ||
| cups = "ES0012345678901234567890AB" | ||
| start_date = datetime(2024, 1, 1) | ||
| end_date = datetime(2024, 1, 1, 23, 59, 59) | ||
| # Call method | ||
| result = await billing_service.get_daily_costs( | ||
| cups, sample_pricing_rules_custom, start_date, end_date | ||
| ) | ||
| # Assertions | ||
| assert result == [] # Should return empty list when update fails | ||
| mock_update.assert_called_once() | ||
| @pytest.mark.asyncio | ||
| async def test_get_monthly_costs_with_existing_data( | ||
| self, billing_service, mock_database_service, sample_pricing_rules_custom | ||
| ): | ||
| """Test get_monthly_costs with existing billing data.""" | ||
| from edata.models.database import BillingModel | ||
| # Create mock billing records for 2 days in same month | ||
| base_date = datetime(2024, 1, 1, 0, 0, 0) | ||
| mock_billing_records = [] | ||
| # Create 48 hours of billing data (2 days) | ||
| for hour in range(48): | ||
| record = BillingModel( | ||
| datetime=base_date + timedelta(hours=hour), | ||
| cups="ES0012345678901234567890AB", | ||
| pricing_config_hash="test_hash", | ||
| total_eur=1.5 + (hour * 0.1), # Varying costs | ||
| energy_term=1.0 + (hour * 0.05), | ||
| power_term=0.3, | ||
| others_term=0.1, | ||
| surplus_term=0.1 + (hour * 0.05), | ||
| ) | ||
| mock_billing_records.append(record) | ||
| # Setup mocks | ||
| mock_database_service.generate_pricing_config_hash.return_value = "test_hash" | ||
| mock_database_service.get_billing.return_value = mock_billing_records | ||
| # Test parameters | ||
| cups = "ES0012345678901234567890AB" | ||
| start_date = datetime(2024, 1, 1) | ||
| end_date = datetime(2024, 1, 31, 23, 59, 59) | ||
| # Call method | ||
| result = await billing_service.get_monthly_costs( | ||
| cups, sample_pricing_rules_custom, start_date, end_date | ||
| ) | ||
| # Assertions | ||
| assert len(result) == 1 # One month | ||
| from edata.models.pricing import PricingAggregated | ||
| assert all(isinstance(item, PricingAggregated) for item in result) | ||
| # Check month | ||
| month_data = result[0] | ||
| assert month_data.datetime.date() == datetime(2024, 1, 1).date() | ||
| assert month_data.delta_h == 48 # 48 hours from mock data | ||
| assert month_data.value_eur > 0 | ||
| @pytest.mark.asyncio | ||
| async def test_get_monthly_costs_multiple_months( | ||
| self, billing_service, mock_database_service, sample_pricing_rules_custom | ||
| ): | ||
| """Test get_monthly_costs with data spanning multiple months.""" | ||
| from edata.models.database import BillingModel | ||
| # Create billing records spanning two months | ||
| records = [] | ||
| # January data (24 hours) | ||
| for hour in range(24): | ||
| record = BillingModel( | ||
| datetime=datetime(2024, 1, 15) + timedelta(hours=hour), | ||
| cups="ES0012345678901234567890AB", | ||
| pricing_config_hash="test_hash", | ||
| total_eur=1.0, | ||
| energy_term=0.7, | ||
| power_term=0.2, | ||
| others_term=0.1, | ||
| surplus_term=0.0, | ||
| ) | ||
| records.append(record) | ||
| # February data (24 hours) | ||
| for hour in range(24): | ||
| record = BillingModel( | ||
| datetime=datetime(2024, 2, 15) + timedelta(hours=hour), | ||
| cups="ES0012345678901234567890AB", | ||
| pricing_config_hash="test_hash", | ||
| total_eur=1.2, | ||
| energy_term=0.8, | ||
| power_term=0.3, | ||
| others_term=0.1, | ||
| surplus_term=0.0, | ||
| ) | ||
| records.append(record) | ||
| # Setup mocks | ||
| mock_database_service.generate_pricing_config_hash.return_value = "test_hash" | ||
| mock_database_service.get_billing.return_value = records | ||
| # Test parameters | ||
| cups = "ES0012345678901234567890AB" | ||
| start_date = datetime(2024, 1, 1) | ||
| end_date = datetime(2024, 2, 28, 23, 59, 59) | ||
| # Call method | ||
| result = await billing_service.get_monthly_costs( | ||
| cups, sample_pricing_rules_custom, start_date, end_date | ||
| ) | ||
| # Assertions | ||
| assert len(result) == 2 # Two months | ||
| # Check January | ||
| jan_data = result[0] | ||
| assert jan_data.datetime.date() == datetime(2024, 1, 1).date() | ||
| assert jan_data.delta_h == 24 | ||
| assert jan_data.value_eur == 24.0 # 24 hours * 1.0 EUR | ||
| # Check February | ||
| feb_data = result[1] | ||
| assert feb_data.datetime.date() == datetime(2024, 2, 1).date() | ||
| assert feb_data.delta_h == 24 | ||
| assert feb_data.value_eur == 28.8 # 24 hours * 1.2 EUR | ||
| @pytest.mark.asyncio | ||
| async def test_get_daily_costs_error_handling( | ||
| self, billing_service, mock_database_service, sample_pricing_rules_custom | ||
| ): | ||
| """Test error handling in get_daily_costs.""" | ||
| # Setup mocks to raise exception | ||
| mock_database_service.generate_pricing_config_hash.side_effect = Exception( | ||
| "Database error" | ||
| ) | ||
| # Test parameters | ||
| cups = "ES0012345678901234567890AB" | ||
| start_date = datetime(2024, 1, 1) | ||
| end_date = datetime(2024, 1, 1, 23, 59, 59) | ||
| # Call method and expect exception | ||
| with pytest.raises(Exception, match="Database error"): | ||
| await billing_service.get_daily_costs( | ||
| cups, sample_pricing_rules_custom, start_date, end_date | ||
| ) | ||
| @pytest.mark.asyncio | ||
| async def test_get_monthly_costs_error_handling( | ||
| self, billing_service, mock_database_service, sample_pricing_rules_custom | ||
| ): | ||
| """Test error handling in get_monthly_costs.""" | ||
| # Setup mocks to raise exception | ||
| mock_database_service.generate_pricing_config_hash.side_effect = Exception( | ||
| "Database error" | ||
| ) | ||
| # Test parameters | ||
| cups = "ES0012345678901234567890AB" | ||
| start_date = datetime(2024, 1, 1) | ||
| end_date = datetime(2024, 1, 31, 23, 59, 59) | ||
| # Call method and expect exception | ||
| with pytest.raises(Exception, match="Database error"): | ||
| await billing_service.get_monthly_costs( | ||
| cups, sample_pricing_rules_custom, start_date, end_date | ||
| ) |
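The monthly figures asserted above (24 h at 1.0 EUR → 24.0; 24 h at 1.2 EUR → 28.8) follow from grouping hourly billing rows by calendar month and summing cost and hours. A rough standalone sketch of that aggregation; `aggregate_monthly` is an illustrative helper, not the service's actual code:

```python
from collections import defaultdict
from datetime import datetime, timedelta


def aggregate_monthly(records):
    """Group (datetime, total_eur) pairs by month, summing cost and hours."""
    buckets = defaultdict(lambda: {"value_eur": 0.0, "delta_h": 0})
    for dt, eur in records:
        # Bucket key: first instant of the record's month.
        key = dt.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
        buckets[key]["value_eur"] += eur
        buckets[key]["delta_h"] += 1  # one record per hour
    return [{"datetime": k, **v} for k, v in sorted(buckets.items())]


# Mirror the multiple-months test data: 24 h in January, 24 h in February.
rows = [(datetime(2024, 1, 15) + timedelta(hours=h), 1.0) for h in range(24)]
rows += [(datetime(2024, 2, 15) + timedelta(hours=h), 1.2) for h in range(24)]
result = aggregate_monthly(rows)
```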
| """Tests for ConsumptionService.""" | ||
| import shutil | ||
| import tempfile | ||
| from datetime import date, datetime, timedelta | ||
| from unittest.mock import AsyncMock, Mock, patch | ||
| import pytest | ||
| import pytest_asyncio | ||
| from edata.connectors.datadis import DatadisConnector | ||
| from edata.models.consumption import Consumption, ConsumptionAggregated | ||
| from edata.services.consumption import ConsumptionService | ||
| class TestConsumptionService: | ||
| """Test suite for ConsumptionService.""" | ||
| @pytest.fixture | ||
| def temp_dir(self): | ||
| """Create a temporary directory for tests.""" | ||
| temp_dir = tempfile.mkdtemp() | ||
| yield temp_dir | ||
| shutil.rmtree(temp_dir) | ||
| @pytest.fixture | ||
| def mock_datadis_connector(self): | ||
| """Mock DatadisConnector for testing.""" | ||
| with patch( | ||
| "edata.services.consumption.DatadisConnector" | ||
| ) as mock_connector_class: | ||
| mock_connector = Mock(spec=DatadisConnector) | ||
| mock_connector_class.return_value = mock_connector | ||
| yield mock_connector, mock_connector_class | ||
| @pytest.fixture | ||
| def mock_database_service(self): | ||
| """Mock DatabaseService for testing.""" | ||
| with patch("edata.services.consumption.get_database_service") as mock_get_db: | ||
| mock_db = Mock() | ||
| # Make the async methods return AsyncMock | ||
| mock_db.get_consumptions = AsyncMock(return_value=[]) | ||
| mock_db.save_consumption = AsyncMock(return_value=Mock()) | ||
| mock_db.get_latest_consumption = AsyncMock(return_value=None) | ||
| mock_get_db.return_value = mock_db | ||
| yield mock_db | ||
| @pytest_asyncio.fixture | ||
| async def consumption_service( | ||
| self, temp_dir, mock_datadis_connector, mock_database_service | ||
| ): | ||
| """Create a ConsumptionService instance for testing.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| return ConsumptionService( | ||
| datadis_connector=mock_connector, | ||
| storage_dir=temp_dir, | ||
| ) | ||
| @pytest.fixture | ||
| def sample_consumptions(self): | ||
| """Sample consumption data for testing.""" | ||
| return [ | ||
| Consumption( | ||
| datetime=datetime(2024, 6, 15, 10, 0), | ||
| delta_h=1.0, | ||
| value_kwh=0.5, | ||
| surplus_kwh=0.0, | ||
| real=True, | ||
| ), | ||
| Consumption( | ||
| datetime=datetime(2024, 6, 15, 11, 0), | ||
| delta_h=1.0, | ||
| value_kwh=0.7, | ||
| surplus_kwh=0.0, | ||
| real=True, | ||
| ), | ||
| Consumption( | ||
| datetime=datetime(2024, 6, 15, 12, 0), | ||
| delta_h=1.0, | ||
| value_kwh=0.6, | ||
| surplus_kwh=0.0, | ||
| real=True, | ||
| ), | ||
| ] | ||
| @pytest.mark.asyncio | ||
| async def test_initialization( | ||
| self, temp_dir, mock_datadis_connector, mock_database_service | ||
| ): | ||
| """Test ConsumptionService initialization.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| service = ConsumptionService( | ||
| datadis_connector=mock_connector, | ||
| storage_dir=temp_dir, | ||
| ) | ||
| # Verify service stores the connector and storage directory | ||
| assert service._datadis == mock_connector | ||
| assert service._storage_dir == temp_dir | ||
| # Verify database service is obtained lazily by calling _get_db_service | ||
| db_service = await service._get_db_service() | ||
| assert db_service is mock_database_service | ||
| @pytest.mark.asyncio | ||
| async def test_update_consumptions_success( | ||
| self, | ||
| consumption_service, | ||
| mock_datadis_connector, | ||
| mock_database_service, | ||
| sample_consumptions, | ||
| ): | ||
| """Test successful consumption update.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| distributor_code = "123" | ||
| start_date = datetime(2024, 6, 15, 0, 0) | ||
| end_date = datetime(2024, 6, 15, 23, 59) | ||
| # Mock datadis connector response (now returns Pydantic models) | ||
| mock_connector.get_consumption_data.return_value = sample_consumptions | ||
| # Mock database service responses - no existing consumptions | ||
| mock_database_service.get_consumptions.return_value = [] | ||
| # Execute update | ||
| result = await consumption_service.update_consumptions( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Verify datadis connector was called correctly | ||
| mock_connector.get_consumption_data.assert_called_once_with( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| measurement_type="0", | ||
| point_type=5, | ||
| authorized_nif=None, | ||
| ) | ||
| # Verify database service was called for each consumption | ||
| assert mock_database_service.save_consumption.call_count == len( | ||
| sample_consumptions | ||
| ) | ||
| # Verify result structure | ||
| assert result["success"] is True | ||
| assert result["cups"] == cups | ||
| assert result["period"]["start"] == start_date.isoformat() | ||
| assert result["period"]["end"] == end_date.isoformat() | ||
| assert result["stats"]["fetched"] == len(sample_consumptions) | ||
| assert result["stats"]["saved"] == len(sample_consumptions) | ||
| assert result["stats"]["updated"] == 0 | ||
| @pytest.mark.asyncio | ||
| async def test_update_consumptions_with_existing_data( | ||
| self, | ||
| consumption_service, | ||
| mock_datadis_connector, | ||
| mock_database_service, | ||
| sample_consumptions, | ||
| ): | ||
| """Test consumption update with some existing data.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| distributor_code = "123" | ||
| start_date = datetime(2024, 6, 15, 0, 0) | ||
| end_date = datetime(2024, 6, 15, 23, 59) | ||
| # Mock datadis connector response (now returns Pydantic models) | ||
| mock_connector.get_consumption_data.return_value = sample_consumptions | ||
| # Mock get_latest_consumption to return an existing consumption before the start date | ||
| mock_latest = Mock() | ||
| mock_latest.datetime = datetime(2024, 6, 14, 23, 0) # Day before start_date | ||
| mock_database_service.get_latest_consumption.return_value = mock_latest | ||
| # Mock database service responses - first consumption exists, others don't | ||
| def mock_get_consumptions(cups, start_date, end_date): | ||
| if start_date == sample_consumptions[0].datetime: | ||
| return [Mock()] # Existing consumption | ||
| return [] # No existing consumption | ||
| mock_database_service.get_consumptions.side_effect = mock_get_consumptions | ||
| # Execute update | ||
| result = await consumption_service.update_consumptions( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Verify result | ||
| assert result["success"] is True | ||
| assert result["stats"]["fetched"] == len(sample_consumptions) | ||
| assert result["stats"]["saved"] == 2 # Two new consumptions | ||
| assert result["stats"]["updated"] == 1 # One updated consumption | ||
| @pytest.mark.asyncio | ||
| async def test_update_consumptions_with_optional_parameters( | ||
| self, | ||
| consumption_service, | ||
| mock_datadis_connector, | ||
| mock_database_service, | ||
| sample_consumptions, | ||
| ): | ||
| """Test consumption update with optional parameters.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| distributor_code = "123" | ||
| start_date = datetime(2024, 6, 15, 0, 0) | ||
| end_date = datetime(2024, 6, 15, 23, 59) | ||
| measurement_type = "1" | ||
| point_type = 3 | ||
| authorized_nif = "12345678A" | ||
| # Mock datadis connector response (now returns Pydantic models) | ||
| mock_connector.get_consumption_data.return_value = sample_consumptions | ||
| mock_database_service.get_consumptions.return_value = [] | ||
| # Execute update with optional parameters | ||
| result = await consumption_service.update_consumptions( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| measurement_type=measurement_type, | ||
| point_type=point_type, | ||
| authorized_nif=authorized_nif, | ||
| ) | ||
| # Verify datadis connector was called with optional parameters | ||
| mock_connector.get_consumption_data.assert_called_once_with( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| measurement_type=measurement_type, | ||
| point_type=point_type, | ||
| authorized_nif=authorized_nif, | ||
| ) | ||
| assert result["success"] is True | ||
| @pytest.mark.asyncio | ||
| async def test_update_consumptions_error_handling( | ||
| self, consumption_service, mock_datadis_connector, mock_database_service | ||
| ): | ||
| """Test consumption update error handling.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| distributor_code = "123" | ||
| start_date = datetime(2024, 6, 15, 0, 0) | ||
| end_date = datetime(2024, 6, 15, 23, 59) | ||
| # Mock datadis connector to raise an exception | ||
| error_message = "API connection failed" | ||
| mock_connector.get_consumption_data.side_effect = Exception(error_message) | ||
| # Mock database service to return None for get_latest_consumption (no existing data) | ||
| mock_database_service.get_latest_consumption.return_value = None | ||
| # Execute update | ||
| result = await consumption_service.update_consumptions( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Verify error result | ||
| assert result["success"] is False | ||
| assert result["cups"] == cups | ||
| assert result["error"] == error_message | ||
| assert result["period"]["start"] == start_date.isoformat() | ||
| assert result["period"]["end"] == end_date.isoformat() | ||
| # Verify database service was not called | ||
| mock_database_service.save_consumption.assert_not_called() | ||
| @pytest.mark.asyncio | ||
| async def test_update_consumptions_with_force_full_update( | ||
| self, | ||
| consumption_service, | ||
| mock_datadis_connector, | ||
| mock_database_service, | ||
| sample_consumptions, | ||
| ): | ||
| """Test consumption update with force_full_update=True ignores existing data.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| distributor_code = "123" | ||
| start_date = datetime(2024, 6, 15, 0, 0) | ||
| end_date = datetime(2024, 6, 15, 23, 59) | ||
| # Mock datadis connector response | ||
| mock_connector.get_consumption_data.return_value = sample_consumptions | ||
| # Mock get_latest_consumption to return existing data | ||
| mock_latest = Mock() | ||
| mock_latest.datetime = datetime( | ||
| 2024, 6, 15, 12, 0 | ||
| ) # Within the requested range | ||
| mock_database_service.get_latest_consumption.return_value = mock_latest | ||
| # Mock database service responses - no existing consumptions | ||
| mock_database_service.get_consumptions.return_value = [] | ||
| # Execute update with force_full_update=True | ||
| result = await consumption_service.update_consumptions( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| force_full_update=True, | ||
| ) | ||
| # Verify datadis connector was called with original start_date (not optimized) | ||
| mock_connector.get_consumption_data.assert_called_once_with( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, # Should use original start_date, not optimized | ||
| end_date=end_date, | ||
| measurement_type="0", | ||
| point_type=5, | ||
| authorized_nif=None, | ||
| ) | ||
| # Verify result | ||
| assert result["success"] is True | ||
| assert result["stats"]["fetched"] == len(sample_consumptions) | ||
| @pytest.mark.asyncio | ||
| async def test_update_consumptions_incremental_optimization( | ||
| self, | ||
| consumption_service, | ||
| mock_datadis_connector, | ||
| mock_database_service, | ||
| sample_consumptions, | ||
| ): | ||
| """Test that consumption update optimizes by starting from last consumption date.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| distributor_code = "123" | ||
| start_date = datetime(2024, 6, 15, 0, 0) | ||
| end_date = datetime(2024, 6, 15, 23, 59) | ||
| # Mock datadis connector response | ||
| mock_connector.get_consumption_data.return_value = sample_consumptions | ||
| # Mock get_latest_consumption to return existing data | ||
| mock_latest = Mock() | ||
| mock_latest.datetime = datetime(2024, 6, 15, 8, 0) # 8 AM on same day | ||
| mock_database_service.get_latest_consumption.return_value = mock_latest | ||
| # Mock database service responses - no existing consumptions for the new range | ||
| mock_database_service.get_consumptions.return_value = [] | ||
| # Execute update | ||
| result = await consumption_service.update_consumptions( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Verify datadis connector was called with optimized start_date (9 AM) | ||
| expected_optimized_start = datetime(2024, 6, 15, 9, 0) # last + 1 hour | ||
| mock_connector.get_consumption_data.assert_called_once_with( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=expected_optimized_start, # Should be optimized | ||
| end_date=end_date, | ||
| measurement_type="0", | ||
| point_type=5, | ||
| authorized_nif=None, | ||
| ) | ||
| # Verify result includes message about optimization | ||
| assert result["success"] is True | ||
| assert "message" in result | ||
| assert "missing data" in result["message"] | ||
| @pytest.mark.asyncio | ||
| async def test_update_consumptions_up_to_date( | ||
| self, | ||
| consumption_service, | ||
| mock_datadis_connector, | ||
| mock_database_service, | ||
| ): | ||
| """Test consumption update when data is already up to date.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| distributor_code = "123" | ||
| start_date = datetime(2024, 6, 15, 0, 0) | ||
| end_date = datetime(2024, 6, 15, 23, 59) | ||
| # Mock get_latest_consumption to return data beyond end_date | ||
| mock_latest = Mock() | ||
| mock_latest.datetime = datetime(2024, 6, 16, 1, 0) # After end_date | ||
| mock_database_service.get_latest_consumption.return_value = mock_latest | ||
| # Execute update | ||
| result = await consumption_service.update_consumptions( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Verify datadis connector was NOT called (data is up to date) | ||
| mock_connector.get_consumption_data.assert_not_called() | ||
| # Verify result indicates no new data needed | ||
| assert result["success"] is True | ||
| assert result["stats"]["fetched"] == 0 | ||
| assert result["stats"]["skipped"] == "up_to_date" | ||
| assert "up to date" in result["message"] | ||
| @pytest.mark.asyncio | ||
| async def test_update_consumption_range_by_months_single_month( | ||
| self, | ||
| consumption_service, | ||
| mock_datadis_connector, | ||
| mock_database_service, | ||
| sample_consumptions, | ||
| ): | ||
| """Test consumption range update for a single month.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| distributor_code = "123" | ||
| start_date = datetime(2024, 6, 1, 0, 0) | ||
| end_date = datetime(2024, 6, 30, 23, 59) | ||
| # Mock datadis connector response (now returns Pydantic models) | ||
| mock_connector.get_consumption_data.return_value = sample_consumptions | ||
| mock_database_service.get_consumptions.return_value = [] | ||
| # Execute range update | ||
| result = await consumption_service.update_consumption_range_by_months( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Verify result structure | ||
| assert result["success"] is True | ||
| assert result["cups"] == cups | ||
| assert result["months_processed"] == 1 | ||
| assert result["total_stats"]["consumptions_fetched"] == len(sample_consumptions) | ||
| assert result["total_stats"]["consumptions_saved"] == len(sample_consumptions) | ||
| assert result["total_stats"]["consumptions_updated"] == 0 | ||
| assert len(result["monthly_results"]) == 1 | ||
| # Verify monthly result | ||
| monthly_result = result["monthly_results"][0] | ||
| assert monthly_result["month"] == "2024-06" | ||
| assert monthly_result["consumption"]["success"] is True | ||
| @pytest.mark.asyncio | ||
| async def test_update_consumption_range_by_months_multiple_months( | ||
| self, | ||
| consumption_service, | ||
| mock_datadis_connector, | ||
| mock_database_service, | ||
| sample_consumptions, | ||
| ): | ||
| """Test consumption range update for multiple months.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| distributor_code = "123" | ||
| start_date = datetime(2024, 5, 15, 0, 0) | ||
| end_date = datetime(2024, 7, 15, 23, 59) | ||
| # Mock datadis connector response (now returns Pydantic models) | ||
| mock_connector.get_consumption_data.return_value = sample_consumptions | ||
| mock_database_service.get_consumptions.return_value = [] | ||
| # Execute range update | ||
| result = await consumption_service.update_consumption_range_by_months( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Should process 3 months: May (partial), June (full), July (partial) | ||
| assert result["months_processed"] == 3 | ||
| assert len(result["monthly_results"]) == 3 | ||
| # Verify month identifiers | ||
| months = [r["month"] for r in result["monthly_results"]] | ||
| assert "2024-05" in months | ||
| assert "2024-06" in months | ||
| assert "2024-07" in months | ||
| # Verify total stats | ||
| expected_total_fetched = len(sample_consumptions) * 3 | ||
| assert result["total_stats"]["consumptions_fetched"] == expected_total_fetched | ||
| @pytest.mark.asyncio | ||
| async def test_update_consumption_range_by_months_with_errors( | ||
| self, | ||
| consumption_service, | ||
| mock_datadis_connector, | ||
| mock_database_service, | ||
| sample_consumptions, | ||
| ): | ||
| """Test consumption range update with some months failing.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| distributor_code = "123" | ||
| start_date = datetime(2024, 6, 1, 0, 0) | ||
| end_date = datetime(2024, 8, 31, 23, 59) | ||
| # Mock datadis connector to fail on second call | ||
| call_count = 0 | ||
| def mock_get_consumption_data(*args, **kwargs): | ||
| nonlocal call_count | ||
| call_count += 1 | ||
| if call_count == 2: # Second month fails | ||
| raise Exception("API rate limit exceeded") | ||
| return sample_consumptions | ||
| mock_connector.get_consumption_data.side_effect = mock_get_consumption_data | ||
| mock_database_service.get_consumptions.return_value = [] | ||
| # Execute range update | ||
| result = await consumption_service.update_consumption_range_by_months( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Should process 3 months but with one failure | ||
| assert result["months_processed"] == 3 | ||
| assert result["success"] is False # Overall failure due to one failed month | ||
| # Check individual month results | ||
| successful_months = [ | ||
| r for r in result["monthly_results"] if r["consumption"]["success"] | ||
| ] | ||
| failed_months = [ | ||
| r for r in result["monthly_results"] if not r["consumption"]["success"] | ||
| ] | ||
| assert len(successful_months) == 2 | ||
| assert len(failed_months) == 1 | ||
| @pytest.mark.asyncio | ||
| async def test_update_consumption_range_year_boundary( | ||
| self, | ||
| consumption_service, | ||
| mock_datadis_connector, | ||
| mock_database_service, | ||
| sample_consumptions, | ||
| ): | ||
| """Test consumption range update across year boundary.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| distributor_code = "123" | ||
| start_date = datetime(2023, 12, 1, 0, 0) | ||
| end_date = datetime(2024, 2, 28, 23, 59) | ||
| # Mock datadis connector response (now returns Pydantic models) | ||
| mock_connector.get_consumption_data.return_value = sample_consumptions | ||
| mock_database_service.get_consumptions.return_value = [] | ||
| # Execute range update | ||
| result = await consumption_service.update_consumption_range_by_months( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Should process 3 months: December 2023, January 2024, February 2024 | ||
| assert result["months_processed"] == 3 | ||
| # Verify month identifiers | ||
| months = [r["month"] for r in result["monthly_results"]] | ||
| assert "2023-12" in months | ||
| assert "2024-01" in months | ||
| assert "2024-02" in months | ||
| @pytest.mark.asyncio | ||
| async def test_get_stored_consumptions_no_filters( | ||
| self, consumption_service, mock_database_service, sample_consumptions | ||
| ): | ||
| """Test getting stored consumptions without date filters.""" | ||
| cups = "ES1234567890123456789" | ||
| # Mock database service response | ||
| mock_database_service.get_consumptions.return_value = sample_consumptions | ||
| # Execute get stored consumptions | ||
| result = await consumption_service.get_stored_consumptions(cups) | ||
| # Verify database service was called correctly | ||
| mock_database_service.get_consumptions.assert_called_once_with(cups, None, None) | ||
| # Verify result | ||
| assert result == sample_consumptions | ||
| @pytest.mark.asyncio | ||
| async def test_get_stored_consumptions_with_filters( | ||
| self, consumption_service, mock_database_service, sample_consumptions | ||
| ): | ||
| """Test getting stored consumptions with date filters.""" | ||
| cups = "ES1234567890123456789" | ||
| start_date = datetime(2024, 6, 15, 0, 0) | ||
| end_date = datetime(2024, 6, 15, 23, 59) | ||
| # Mock database service response | ||
| filtered_consumptions = sample_consumptions[:2] # Return first two | ||
| mock_database_service.get_consumptions.return_value = filtered_consumptions | ||
| # Execute get stored consumptions with filters | ||
| result = await consumption_service.get_stored_consumptions( | ||
| cups=cups, start_date=start_date, end_date=end_date | ||
| ) | ||
| # Verify database service was called correctly | ||
| mock_database_service.get_consumptions.assert_called_once_with( | ||
| cups, start_date, end_date | ||
| ) | ||
| # Verify result | ||
| assert result == filtered_consumptions | ||
| @pytest.mark.asyncio | ||
| async def test_initialization_default_parameters( | ||
| self, temp_dir, mock_datadis_connector, mock_database_service | ||
| ): | ||
| """Test ConsumptionService initialization with default parameters.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| service = ConsumptionService(datadis_connector=mock_connector) | ||
| # Verify the connector is stored and no custom storage directory is set | ||
| assert service._datadis == mock_connector | ||
| assert service._storage_dir is None | ||
| @patch("edata.services.consumption._LOGGER") | ||
| @pytest.mark.asyncio | ||
| async def test_logging_during_operations( | ||
| self, | ||
| mock_logger, | ||
| consumption_service, | ||
| mock_datadis_connector, | ||
| mock_database_service, | ||
| sample_consumptions, | ||
| ): | ||
| """Test that appropriate logging occurs during operations.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| distributor_code = "123" | ||
| start_date = datetime(2024, 6, 15, 0, 0) | ||
| end_date = datetime(2024, 6, 15, 23, 59) | ||
| # Mock datadis connector response (now returns Pydantic models) | ||
| mock_connector.get_consumption_data.return_value = sample_consumptions | ||
| mock_database_service.get_consumptions.return_value = [] | ||
| # Execute update | ||
| await consumption_service.update_consumptions( | ||
| cups=cups, | ||
| distributor_code=distributor_code, | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Verify logging calls | ||
| assert mock_logger.info.call_count >= 2 # Start and completion logs | ||
| # Verify log messages contain expected information | ||
| log_calls = [call.args[0] for call in mock_logger.info.call_args_list] | ||
| assert any("Updating consumptions" in msg for msg in log_calls) | ||
| assert any("Consumption update completed" in msg for msg in log_calls) | ||
| @pytest.fixture | ||
| def sample_db_consumptions(self): | ||
| """Sample database consumption data for aggregation testing.""" | ||
| from edata.services.database import ConsumptionModel as DbConsumption | ||
| # Use Monday (weekday 0) instead of Saturday for proper tariff testing | ||
| base_date = datetime(2024, 6, 17) # Monday, June 17, 2024 | ||
| # Create 48 hours of hourly data (2 days: Monday and Tuesday) | ||
| db_consumptions = [] | ||
| for hour in range(48): | ||
| dt = base_date + timedelta(hours=hour) | ||
| # Vary consumption by hour to test tariff periods | ||
| if 10 <= dt.hour <= 13 or 18 <= dt.hour <= 21: # P1 hours | ||
| kwh = 1.5 | ||
| elif dt.hour in [8, 9, 14, 15, 16, 17, 22, 23]: # P2 hours | ||
| kwh = 1.0 | ||
| else: # P3 hours | ||
| kwh = 0.5 | ||
| db_cons = Mock(spec=DbConsumption) | ||
| db_cons.datetime = dt | ||
| db_cons.delta_h = 1.0 | ||
| db_cons.value_kwh = kwh | ||
| db_cons.surplus_kwh = ( | ||
| 0.1 if hour % 10 == 0 else 0.0 | ||
| ) # Some surplus every 10 hours | ||
| db_cons.real = True | ||
| db_consumptions.append(db_cons) | ||
| return db_consumptions | ||
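The fixture above encodes an assumed weekday hour-to-period mapping for the Spanish 2.0TD tariff (P1 peak, P2 shoulder, P3 off-peak). As a minimal sketch of that assumption (the real mapping lives in edata's `get_pvpc_tariff`, which also handles weekends and holidays):

```python
def classify_weekday_hour(hour: int) -> str:
    """Map a weekday hour (0-23) to a 2.0TD tariff period.
    Sketch only; mirrors the buckets used by the fixture above."""
    if 10 <= hour <= 13 or 18 <= hour <= 21:
        return "p1"  # peak
    if hour in (8, 9, 14, 15, 16, 17, 22, 23):
        return "p2"  # shoulder
    return "p3"  # off-peak (00:00-07:59)

# Each period covers exactly 8 of the 24 weekday hours
periods = [classify_weekday_hour(h) for h in range(24)]
```

This is why the expected totals below multiply each per-period rate by 8 hours.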
| @pytest.mark.asyncio | ||
| async def test_get_daily_consumptions( | ||
| self, | ||
| consumption_service, | ||
| mock_datadis_connector, | ||
| mock_database_service, | ||
| sample_db_consumptions, | ||
| ): | ||
| """Test daily consumption aggregation.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| start_date = datetime(2024, 6, 17, 0, 0) # Monday | ||
| end_date = datetime(2024, 6, 18, 23, 59) # Tuesday | ||
| # Mock database service to return sample data | ||
| mock_database_service.get_consumptions.return_value = sample_db_consumptions | ||
| # Execute daily aggregation | ||
| daily_consumptions = await consumption_service.get_daily_consumptions( | ||
| cups=cups, start_date=start_date, end_date=end_date | ||
| ) | ||
| # Verify database service was called correctly | ||
| mock_database_service.get_consumptions.assert_called_once_with( | ||
| cups, start_date, end_date | ||
| ) | ||
| # Should have 2 days of data | ||
| assert len(daily_consumptions) == 2 | ||
| # Verify first day aggregation | ||
| day1 = daily_consumptions[0] | ||
| assert isinstance(day1, ConsumptionAggregated) | ||
| assert day1.datetime.date() == date(2024, 6, 17) # Monday | ||
| assert day1.delta_h == 24.0 # 24 hours | ||
| # Verify total consumption (should be sum of all hourly values) | ||
| expected_day1_total = (8 * 1.5) + (8 * 1.0) + (8 * 0.5) # P1 + P2 + P3 hours | ||
| assert day1.value_kwh == expected_day1_total | ||
| # Verify P1 consumption (hours 10-13, 18-21) | ||
| expected_p1 = 8 * 1.5 # 8 P1 hours at 1.5 kWh each | ||
| assert day1.value_p1_kwh == expected_p1 | ||
| # Verify some surplus was recorded | ||
| assert day1.surplus_kwh > 0 | ||
| # Verify second day | ||
| day2 = daily_consumptions[1] | ||
| assert day2.datetime.date() == date(2024, 6, 18) # Tuesday | ||
| assert day2.delta_h == 24.0 | ||
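The daily aggregation being tested boils down to grouping hourly rows by calendar day and summing them. A rough, service-independent sketch of that grouping (not the actual implementation, which also splits totals per tariff period):

```python
from collections import defaultdict
from datetime import datetime

def aggregate_daily(rows):
    """Sum (datetime, kwh) pairs into per-day totals."""
    totals = defaultdict(float)
    for dt, kwh in rows:
        totals[dt.date()] += kwh
    return dict(totals)

rows = [
    (datetime(2024, 6, 17, 0), 0.5),
    (datetime(2024, 6, 17, 12), 1.5),
    (datetime(2024, 6, 18, 12), 1.5),
]
daily = aggregate_daily(rows)
```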
| @pytest.mark.asyncio | ||
| async def test_get_monthly_consumptions( | ||
| self, | ||
| consumption_service, | ||
| mock_datadis_connector, | ||
| mock_database_service, | ||
| sample_db_consumptions, | ||
| ): | ||
| """Test monthly consumption aggregation.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| start_date = datetime(2024, 6, 17, 0, 0) # Monday | ||
| end_date = datetime(2024, 6, 18, 23, 59) # Tuesday | ||
| # Mock database service to return sample data | ||
| mock_database_service.get_consumptions.return_value = sample_db_consumptions | ||
| # Execute monthly aggregation | ||
| monthly_consumptions = await consumption_service.get_monthly_consumptions( | ||
| cups=cups, start_date=start_date, end_date=end_date | ||
| ) | ||
| # Verify database service was called correctly | ||
| mock_database_service.get_consumptions.assert_called_once_with( | ||
| cups, start_date, end_date | ||
| ) | ||
| # Should have 1 month of data (both days in same month) | ||
| assert len(monthly_consumptions) == 1 | ||
| # Verify monthly aggregation | ||
| month = monthly_consumptions[0] | ||
| assert isinstance(month, ConsumptionAggregated) | ||
| assert month.datetime.replace(day=1).date() == date(2024, 6, 1) | ||
| assert month.delta_h == 48.0 # 48 hours total | ||
| # Verify total consumption (should be sum of both days) | ||
| expected_total = 2 * ((8 * 1.5) + (8 * 1.0) + (8 * 0.5)) | ||
| assert month.value_kwh == expected_total | ||
| # Verify P1 consumption | ||
| expected_p1 = 2 * (8 * 1.5) # 2 days * 8 P1 hours * 1.5 kWh | ||
| assert month.value_p1_kwh == expected_p1 | ||
| @pytest.mark.asyncio | ||
| async def test_get_monthly_consumptions_with_cycle_start_day( | ||
| self, consumption_service, mock_datadis_connector, mock_database_service | ||
| ): | ||
| """Test monthly consumption aggregation with custom cycle start day.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| start_date = datetime(2024, 6, 1, 0, 0) | ||
| end_date = datetime(2024, 6, 30, 23, 59) | ||
| # Create sample data spanning across billing cycle boundary | ||
| from edata.services.database import ConsumptionModel as DbConsumption | ||
| db_consumptions = [] | ||
| # Data on June 14th (before cycle start) | ||
| dt1 = datetime(2024, 6, 14, 12, 0) | ||
| db_cons1 = Mock(spec=DbConsumption) | ||
| db_cons1.datetime = dt1 | ||
| db_cons1.delta_h = 1.0 | ||
| db_cons1.value_kwh = 2.0 | ||
| db_cons1.surplus_kwh = 0.0 | ||
| db_cons1.real = True | ||
| db_consumptions.append(db_cons1) | ||
| # Data on June 16th (after cycle start) | ||
| dt2 = datetime(2024, 6, 16, 12, 0) | ||
| db_cons2 = Mock(spec=DbConsumption) | ||
| db_cons2.datetime = dt2 | ||
| db_cons2.delta_h = 1.0 | ||
| db_cons2.value_kwh = 3.0 | ||
| db_cons2.surplus_kwh = 0.0 | ||
| db_cons2.real = True | ||
| db_consumptions.append(db_cons2) | ||
| mock_database_service.get_consumptions.return_value = db_consumptions | ||
| # Execute with cycle start day = 15 | ||
| monthly_consumptions = await consumption_service.get_monthly_consumptions( | ||
| cups=cups, start_date=start_date, end_date=end_date, cycle_start_day=15 | ||
| ) | ||
| # Should have 2 months (May billing period and June billing period) | ||
| assert len(monthly_consumptions) == 2 | ||
| # Verify the months | ||
| months = sorted([m.datetime for m in monthly_consumptions]) | ||
| assert months[0].month == 5 # May billing period (for June 14th data) | ||
| assert months[1].month == 6 # June billing period (for June 16th data) | ||
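The `cycle_start_day` behaviour exercised above can be sketched as a small mapping from a timestamp to its billing period: a day before the cycle start belongs to the previous month's period. This is an illustrative assumption about the service's semantics, inferred from the assertions, not its actual code:

```python
from datetime import datetime

def billing_month(dt: datetime, cycle_start_day: int = 1) -> tuple:
    """Return (year, month) of the billing period containing dt,
    where each period starts on cycle_start_day of the month."""
    year, month = dt.year, dt.month
    if dt.day < cycle_start_day:
        # Before the cycle boundary: previous month's period
        month -= 1
        if month == 0:
            month, year = 12, year - 1
    return (year, month)
```

With `cycle_start_day=15`, June 14th falls in the May period and June 16th in the June period, matching the two months asserted above.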
| @pytest.mark.asyncio | ||
| async def test_get_daily_consumptions_empty_data( | ||
| self, consumption_service, mock_datadis_connector, mock_database_service | ||
| ): | ||
| """Test daily consumption aggregation with no data.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| start_date = datetime(2024, 6, 17, 0, 0) # Monday | ||
| end_date = datetime(2024, 6, 17, 23, 59) # Monday | ||
| # Mock database service to return empty data | ||
| mock_database_service.get_consumptions.return_value = [] | ||
| # Execute daily aggregation | ||
| daily_consumptions = await consumption_service.get_daily_consumptions( | ||
| cups=cups, start_date=start_date, end_date=end_date | ||
| ) | ||
| # Should return empty list | ||
| assert len(daily_consumptions) == 0 | ||
| @pytest.mark.asyncio | ||
| async def test_get_monthly_consumptions_empty_data( | ||
| self, consumption_service, mock_datadis_connector, mock_database_service | ||
| ): | ||
| """Test monthly consumption aggregation with no data.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| start_date = datetime(2024, 6, 15, 0, 0) | ||
| end_date = datetime(2024, 6, 15, 23, 59) | ||
| # Mock database service to return empty data | ||
| mock_database_service.get_consumptions.return_value = [] | ||
| # Execute monthly aggregation | ||
| monthly_consumptions = await consumption_service.get_monthly_consumptions( | ||
| cups=cups, start_date=start_date, end_date=end_date | ||
| ) | ||
| # Should return empty list | ||
| assert len(monthly_consumptions) == 0 | ||
| @pytest.mark.asyncio | ||
| @patch("edata.services.consumption.get_pvpc_tariff") | ||
| async def test_tariff_calculation_in_aggregations( | ||
| self, | ||
| mock_get_pvpc_tariff, | ||
| consumption_service, | ||
| mock_datadis_connector, | ||
| mock_database_service, | ||
| ): | ||
| """Test that tariff calculation is used correctly in aggregations.""" | ||
| mock_connector, mock_connector_class = mock_datadis_connector | ||
| cups = "ES1234567890123456789" | ||
| start_date = datetime(2024, 6, 17, 0, 0) # Monday | ||
| end_date = datetime(2024, 6, 17, 23, 59) # Monday | ||
| # Create single consumption data | ||
| from edata.services.database import ConsumptionModel as DbConsumption | ||
| db_cons = Mock(spec=DbConsumption) | ||
| db_cons.datetime = datetime(2024, 6, 17, 12, 0) # Monday noon | ||
| db_cons.delta_h = 1.0 | ||
| db_cons.value_kwh = 2.0 | ||
| db_cons.surplus_kwh = 0.1 | ||
| db_cons.real = True | ||
| mock_database_service.get_consumptions.return_value = [db_cons] | ||
| # Mock tariff calculation to return P2 | ||
| mock_get_pvpc_tariff.return_value = "p2" | ||
| # Execute daily aggregation with await | ||
| daily_consumptions = await consumption_service.get_daily_consumptions( | ||
| cups=cups, start_date=start_date, end_date=end_date | ||
| ) | ||
| # Verify tariff function was called | ||
| mock_get_pvpc_tariff.assert_called_with(datetime(2024, 6, 17, 12, 0)) | ||
| # Verify P2 values were set correctly | ||
| assert len(daily_consumptions) == 1 | ||
| day = daily_consumptions[0] | ||
| assert day.value_p2_kwh == 2.0 | ||
| assert day.surplus_p2_kwh == 0.1 | ||
| assert day.value_p1_kwh == 0.0 | ||
| assert day.value_p3_kwh == 0.0 |
| """Tests for ContractService.""" | ||
| from datetime import datetime | ||
| from unittest.mock import AsyncMock, Mock, patch | ||
| import pytest | ||
| import pytest_asyncio | ||
| from edata.models.database import ContractModel | ||
| from edata.services.contract import ContractService | ||
| @pytest_asyncio.fixture | ||
| async def contract_service(): | ||
| """Create a contract service with mocked dependencies.""" | ||
| with patch("edata.services.contract.get_database_service") as mock_db_factory: | ||
| mock_db = Mock() | ||
| # Make the database methods return AsyncMocks | ||
| mock_db.get_contracts = AsyncMock(return_value=[]) | ||
| mock_db.save_contract = AsyncMock(return_value=Mock()) | ||
| mock_db_factory.return_value = mock_db | ||
| # Create a mock DatadisConnector | ||
| mock_datadis_connector = Mock() | ||
| service = ContractService(datadis_connector=mock_datadis_connector) | ||
| service._db_service = mock_db | ||
| return service | ||
| @pytest.fixture | ||
| def sample_contracts(): | ||
| """Sample contract data for testing.""" | ||
| return [ | ||
| ContractModel( | ||
| cups="ES0012345678901234567890AB", | ||
| date_start=datetime(2023, 1, 1), | ||
| date_end=datetime(2023, 12, 31), | ||
| marketer="Test Marketer 1", | ||
| distributor_code="123", | ||
| power_p1=4.6, | ||
| power_p2=4.6, | ||
| ), | ||
| ContractModel( | ||
| cups="ES0012345678901234567890AB", | ||
| date_start=datetime(2024, 1, 1), | ||
| date_end=datetime(2024, 12, 31), | ||
| marketer="Test Marketer 2", | ||
| distributor_code="123", | ||
| power_p1=5.0, | ||
| power_p2=5.0, | ||
| ), | ||
| ] | ||
| class TestContractService: | ||
| """Test class for ContractService.""" | ||
| @pytest.mark.asyncio | ||
| async def test_update_contracts_success(self, contract_service, sample_contracts): | ||
| """Test successful contract update.""" | ||
| # Setup mocks - now returns Pydantic models instead of dicts | ||
| contract_service._datadis.get_contract_detail = AsyncMock( | ||
| return_value=[ | ||
| ContractModel( | ||
| cups="ES0012345678901234567890AB", | ||
| date_start=datetime(2024, 1, 1), | ||
| date_end=datetime(2024, 12, 31), | ||
| marketer="Test Marketer", | ||
| distributor_code="123", | ||
| power_p1=5.0, | ||
| power_p2=5.0, | ||
| ) | ||
| ] | ||
| ) | ||
| contract_service._db_service.get_contracts.return_value = [] | ||
| contract_service._db_service.save_contract.return_value = sample_contracts[0] | ||
| # Execute | ||
| result = await contract_service.update_contracts( | ||
| cups="ES0012345678901234567890AB", distributor_code="123" | ||
| ) | ||
| # Verify | ||
| assert result["success"] is True | ||
| assert result["stats"]["fetched"] == 1 | ||
| assert result["stats"]["saved"] == 1 | ||
| assert result["stats"]["updated"] == 0 | ||
| contract_service._datadis.get_contract_detail.assert_called_once() | ||
| contract_service._db_service.save_contract.assert_called_once() | ||
| @pytest.mark.asyncio | ||
| async def test_get_contracts(self, contract_service, sample_contracts): | ||
| """Test getting contracts.""" | ||
| # Setup mocks | ||
| contract_service._db_service.get_contracts.return_value = sample_contracts | ||
| # Execute | ||
| result = await contract_service.get_contracts("ES0012345678901234567890AB") | ||
| # Verify | ||
| assert len(result) == 2 | ||
| assert result[0].power_p1 == 4.6 | ||
| assert result[1].power_p1 == 5.0 | ||
| contract_service._db_service.get_contracts.assert_called_once_with( | ||
| cups="ES0012345678901234567890AB" | ||
| ) | ||
| @pytest.mark.asyncio | ||
| async def test_get_active_contract(self, contract_service, sample_contracts): | ||
| """Test getting active contract.""" | ||
| # Setup mocks | ||
| contract_service._db_service.get_contracts.return_value = sample_contracts | ||
| # Execute - test with date in 2024 | ||
| result = await contract_service.get_active_contract( | ||
| "ES0012345678901234567890AB", datetime(2024, 6, 15) | ||
| ) | ||
| # Verify | ||
| assert result is not None | ||
| assert result.power_p1 == 5.0 # Should return 2024 contract | ||
| assert result.date_start.year == 2024 | ||
| @pytest.mark.asyncio | ||
| async def test_get_most_recent_contract(self, contract_service, sample_contracts): | ||
| """Test getting most recent contract.""" | ||
| # Setup mocks | ||
| contract_service._db_service.get_contracts.return_value = sample_contracts | ||
| # Execute - use the correct method name | ||
| result = await contract_service.get_latest_contract( | ||
| "ES0012345678901234567890AB" | ||
| ) | ||
| # Verify | ||
| assert result is not None | ||
| assert result.power_p1 == 5.0 # Should return 2024 contract (most recent) | ||
| assert result.date_start.year == 2024 | ||
| @pytest.mark.asyncio | ||
| async def test_get_contract_stats(self, contract_service, sample_contracts): | ||
| """Test getting contract statistics.""" | ||
| # Setup mocks | ||
| contract_service._db_service.get_contracts.return_value = sample_contracts | ||
| # Execute | ||
| result = await contract_service.get_contract_stats("ES0012345678901234567890AB") | ||
| # Verify | ||
| assert result["total_contracts"] == 2 | ||
| assert result["power_ranges"]["p1_kw"]["min"] == 4.6 | ||
| assert result["power_ranges"]["p1_kw"]["max"] == 5.0 | ||
| assert result["power_ranges"]["p2_kw"]["min"] == 4.6 | ||
| assert result["power_ranges"]["p2_kw"]["max"] == 5.0 | ||
| assert result["date_range"]["earliest_start"] == datetime(2023, 1, 1) | ||
| assert result["date_range"]["latest_end"] == datetime(2024, 12, 31) | ||
| @pytest.mark.asyncio | ||
| async def test_update_contracts_no_data(self, contract_service): | ||
| """Test contract update with no data returned.""" | ||
| # Setup mocks | ||
| contract_service._datadis.get_contract_detail = AsyncMock(return_value=[]) | ||
| # Execute | ||
| result = await contract_service.update_contracts( | ||
| cups="ES0012345678901234567890AB", distributor_code="123" | ||
| ) | ||
| # Verify | ||
| assert result["success"] is True | ||
| assert result["stats"]["fetched"] == 0 | ||
| assert result["stats"]["saved"] == 0 | ||
| @pytest.mark.asyncio | ||
| async def test_update_contracts_error(self, contract_service): | ||
| """Test contract update with error.""" | ||
| # Setup mocks | ||
| contract_service._datadis.get_contract_detail = AsyncMock( | ||
| side_effect=Exception("API Error") | ||
| ) | ||
| # Execute | ||
| result = await contract_service.update_contracts( | ||
| cups="ES0012345678901234567890AB", distributor_code="123" | ||
| ) | ||
| # Verify | ||
| assert result["success"] is False | ||
| assert "error" in result | ||
| assert result["error"] == "API Error" | ||
| @pytest.mark.asyncio | ||
| async def test_get_contract_summary(self, contract_service, sample_contracts): | ||
| """Test getting contract summary.""" | ||
| # Setup mocks | ||
| contract_service._db_service.get_contracts.return_value = sample_contracts | ||
| # Execute | ||
| result = await contract_service.get_contract_summary("ES001234567890123456AB") | ||
| # Verify | ||
| assert result["contract_p1_kW"] == 5.0 # From the most recent contract (2024) | ||
| assert result["contract_p2_kW"] == 5.0 | ||
| @pytest.mark.asyncio | ||
| async def test_get_contract_summary_no_data(self, contract_service): | ||
| """Test getting contract summary with no data.""" | ||
| # Setup mocks | ||
| contract_service._db_service.get_contracts.return_value = [] | ||
| # Execute | ||
| result = await contract_service.get_contract_summary("ES001234567890123456AB") | ||
| # Verify | ||
| assert result["contract_p1_kW"] is None | ||
| assert result["contract_p2_kW"] is None |
| """Tests for DatabaseService.""" | ||
| import os | ||
| import shutil | ||
| import tempfile | ||
| from datetime import datetime | ||
| from unittest.mock import patch | ||
| import pytest | ||
| import pytest_asyncio | ||
| from edata.models.database import ( | ||
| ConsumptionModel, | ||
| ContractModel, | ||
| MaxPowerModel, | ||
| SupplyModel, | ||
| ) | ||
| from edata.services.database import get_database_service | ||
| class TestDatabaseService: | ||
| """Test suite for DatabaseService.""" | ||
| @pytest.fixture | ||
| def temp_dir(self): | ||
| """Create a temporary directory for tests.""" | ||
| temp_dir = tempfile.mkdtemp() | ||
| yield temp_dir | ||
| shutil.rmtree(temp_dir) | ||
| @pytest_asyncio.fixture | ||
| async def db_service(self, temp_dir): | ||
| """Create a database service for testing.""" | ||
| # Create a new instance directly instead of using the global singleton | ||
| from edata.services.database import DatabaseService | ||
| db_service = DatabaseService(temp_dir) | ||
| await db_service.create_tables() | ||
| yield db_service | ||
| @pytest.fixture | ||
| def sample_supply_data(self): | ||
| """Sample supply data for testing.""" | ||
| return { | ||
| "cups": "ES1234567890123456789", | ||
| "date_start": datetime(2024, 1, 1), | ||
| "date_end": datetime(2024, 12, 31), | ||
| "address": "Test Address 123", | ||
| "postal_code": "12345", | ||
| "province": "Test Province", | ||
| "municipality": "Test Municipality", | ||
| "distributor": "Test Distributor", | ||
| "point_type": 5, | ||
| "distributor_code": "123", | ||
| } | ||
| @pytest.fixture | ||
| def sample_contract_data(self): | ||
| """Sample contract data for testing.""" | ||
| return { | ||
| "cups": "ES1234567890123456789", | ||
| "date_start": datetime(2024, 1, 1), | ||
| "date_end": datetime(2024, 12, 31), | ||
| "marketer": "Test Marketer", | ||
| "distributor_code": "123", | ||
| "power_p1": 4.4, | ||
| "power_p2": 4.4, | ||
| } | ||
| @pytest.fixture | ||
| def sample_consumption_data(self): | ||
| """Sample consumption data for testing.""" | ||
| return { | ||
| "cups": "ES1234567890123456789", | ||
| "datetime": datetime(2024, 6, 15, 12, 0), | ||
| "delta_h": 1.0, | ||
| "value_kwh": 0.5, | ||
| "surplus_kwh": 0.0, | ||
| "real": True, | ||
| } | ||
| @pytest.fixture | ||
| def sample_maxpower_data(self): | ||
| """Sample maxpower data for testing.""" | ||
| return { | ||
| "cups": "ES1234567890123456789", | ||
| "datetime": datetime(2024, 6, 15, 15, 30), | ||
| "value_kw": 3.2, | ||
| } | ||
| @pytest.mark.asyncio | ||
| async def test_database_initialization(self, temp_dir): | ||
| """Test database service initialization.""" | ||
| service = await get_database_service(storage_dir=temp_dir) | ||
| # Check that database file was created | ||
| expected_db_path = os.path.join(temp_dir, "edata.db") | ||
| assert os.path.exists(expected_db_path) | ||
| # Check that we can get a session | ||
| session = service.get_session() | ||
| assert session is not None | ||
| await session.close() | ||
| @pytest.mark.asyncio | ||
| async def test_save_and_get_supply(self, db_service, sample_supply_data): | ||
| """Test saving and retrieving supply data.""" | ||
| # Save supply | ||
| saved_supply = await db_service.save_supply(sample_supply_data) | ||
| assert saved_supply.cups == sample_supply_data["cups"] | ||
| assert saved_supply.distributor == sample_supply_data["distributor"] | ||
| assert saved_supply.point_type == sample_supply_data["point_type"] | ||
| assert saved_supply.created_at is not None | ||
| assert saved_supply.updated_at is not None | ||
| # Get supply | ||
| retrieved_supply = await db_service.get_supply(sample_supply_data["cups"]) | ||
| assert retrieved_supply is not None | ||
| assert retrieved_supply.cups == sample_supply_data["cups"] | ||
| assert retrieved_supply.distributor == sample_supply_data["distributor"] | ||
| @pytest.mark.asyncio | ||
| async def test_update_existing_supply(self, db_service, sample_supply_data): | ||
| """Test updating an existing supply.""" | ||
| # Save initial supply | ||
| await db_service.save_supply(sample_supply_data) | ||
| # Update supply data | ||
| updated_data = sample_supply_data.copy() | ||
| updated_data["distributor"] = "Updated Distributor" | ||
| # Save updated supply | ||
| updated_supply = await db_service.save_supply(updated_data) | ||
| assert updated_supply.distributor == "Updated Distributor" | ||
| assert updated_supply.cups == sample_supply_data["cups"] | ||
| # Verify only one supply exists | ||
| retrieved_supply = await db_service.get_supply(sample_supply_data["cups"]) | ||
| assert retrieved_supply.distributor == "Updated Distributor" | ||
| @pytest.mark.asyncio | ||
| async def test_save_and_get_contract( | ||
| self, db_service, sample_supply_data, sample_contract_data | ||
| ): | ||
| """Test saving and retrieving contract data.""" | ||
| # Save supply first (foreign key dependency) | ||
| await db_service.save_supply(sample_supply_data) | ||
| # Save contract | ||
| saved_contract = await db_service.save_contract(sample_contract_data) | ||
| assert saved_contract.cups == sample_contract_data["cups"] | ||
| assert saved_contract.marketer == sample_contract_data["marketer"] | ||
| assert saved_contract.power_p1 == sample_contract_data["power_p1"] | ||
| assert saved_contract.id is not None | ||
| # Get contracts | ||
| contracts = await db_service.get_contracts(sample_contract_data["cups"]) | ||
| assert len(contracts) == 1 | ||
| assert contracts[0].marketer == sample_contract_data["marketer"] | ||
| @pytest.mark.asyncio | ||
| async def test_contract_unique_constraint( | ||
| self, db_service, sample_supply_data, sample_contract_data | ||
| ): | ||
| """Test that contract unique constraint works (cups + date_start).""" | ||
| # Save supply first | ||
| await db_service.save_supply(sample_supply_data) | ||
| # Save first contract | ||
| await db_service.save_contract(sample_contract_data) | ||
| # Try to save contract with same cups + date_start but different data | ||
| updated_contract_data = sample_contract_data.copy() | ||
| updated_contract_data["marketer"] = "Different Marketer" | ||
| updated_contract_data["power_p1"] = 6.6 | ||
| # This should update, not create new | ||
| await db_service.save_contract(updated_contract_data) | ||
| # Should still have only one contract, but with updated data | ||
| contracts = await db_service.get_contracts(sample_contract_data["cups"]) | ||
| assert len(contracts) == 1 | ||
| assert contracts[0].marketer == "Different Marketer" | ||
| assert contracts[0].power_p1 == 6.6 | ||
| @pytest.mark.asyncio | ||
| async def test_save_and_get_consumption( | ||
| self, db_service, sample_supply_data, sample_consumption_data | ||
| ): | ||
| """Test saving and retrieving consumption data.""" | ||
| # Save supply first | ||
| await db_service.save_supply(sample_supply_data) | ||
| # Save consumption | ||
| saved_consumption = await db_service.save_consumption(sample_consumption_data) | ||
| assert saved_consumption.cups == sample_consumption_data["cups"] | ||
| assert saved_consumption.value_kwh == sample_consumption_data["value_kwh"] | ||
| assert saved_consumption.real == sample_consumption_data["real"] | ||
| assert saved_consumption.id is not None | ||
| # Get consumptions | ||
| consumptions = await db_service.get_consumptions( | ||
| sample_consumption_data["cups"] | ||
| ) | ||
| assert len(consumptions) == 1 | ||
| assert consumptions[0].value_kwh == sample_consumption_data["value_kwh"] | ||
| @pytest.mark.asyncio | ||
| async def test_get_consumptions_with_date_filter( | ||
| self, db_service, sample_supply_data, sample_consumption_data | ||
| ): | ||
| """Test getting consumptions with date range filter.""" | ||
| # Save supply first | ||
| await db_service.save_supply(sample_supply_data) | ||
| # Save multiple consumptions with different dates | ||
| consumption1 = sample_consumption_data.copy() | ||
| consumption1["datetime"] = datetime(2024, 6, 15, 10, 0) | ||
| consumption2 = sample_consumption_data.copy() | ||
| consumption2["datetime"] = datetime(2024, 6, 16, 10, 0) | ||
| consumption3 = sample_consumption_data.copy() | ||
| consumption3["datetime"] = datetime(2024, 6, 17, 10, 0) | ||
| await db_service.save_consumption(consumption1) | ||
| await db_service.save_consumption(consumption2) | ||
| await db_service.save_consumption(consumption3) | ||
| # Get consumptions with date filter | ||
| start_date = datetime(2024, 6, 15, 12, 0) # After first consumption | ||
| end_date = datetime(2024, 6, 16, 12, 0) # Before third consumption | ||
| filtered_consumptions = await db_service.get_consumptions( | ||
| cups=sample_consumption_data["cups"], | ||
| start_date=start_date, | ||
| end_date=end_date, | ||
| ) | ||
| # Should only get the second consumption | ||
| assert len(filtered_consumptions) == 1 | ||
| assert filtered_consumptions[0].datetime == datetime(2024, 6, 16, 10, 0) | ||
| @pytest.mark.asyncio | ||
| async def test_save_and_get_maxpower( | ||
| self, db_service, sample_supply_data, sample_maxpower_data | ||
| ): | ||
| """Test saving and retrieving maxpower data.""" | ||
| # Save supply first | ||
| await db_service.save_supply(sample_supply_data) | ||
| # Save maxpower | ||
| saved_maxpower = await db_service.save_maxpower(sample_maxpower_data) | ||
| assert saved_maxpower.cups == sample_maxpower_data["cups"] | ||
| assert saved_maxpower.value_kw == sample_maxpower_data["value_kw"] | ||
| assert saved_maxpower.id is not None | ||
| # Get maxpower readings | ||
| maxpower_readings = await db_service.get_maxpower_readings( | ||
| sample_maxpower_data["cups"] | ||
| ) | ||
| assert len(maxpower_readings) == 1 | ||
| assert maxpower_readings[0].value_kw == sample_maxpower_data["value_kw"] | ||
| @pytest.mark.asyncio | ||
| async def test_consumption_unique_constraint( | ||
| self, db_service, sample_supply_data, sample_consumption_data | ||
| ): | ||
| """Test that consumption unique constraint works (cups + datetime).""" | ||
| # Save supply first | ||
| await db_service.save_supply(sample_supply_data) | ||
| # Save first consumption | ||
| await db_service.save_consumption(sample_consumption_data) | ||
| # Try to save consumption with same cups + datetime but different value | ||
| updated_consumption = sample_consumption_data.copy() | ||
| updated_consumption["value_kwh"] = 1.5 | ||
| # This should update, not create new | ||
| await db_service.save_consumption(updated_consumption) | ||
| # Should still have only one consumption, but with updated value | ||
| consumptions = await db_service.get_consumptions( | ||
| sample_consumption_data["cups"] | ||
| ) | ||
| assert len(consumptions) == 1 | ||
| assert consumptions[0].value_kwh == 1.5 | ||
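Both unique-constraint tests rely on the same upsert semantics: a record whose natural key (`cups` + `datetime`, or `cups` + `date_start` for contracts) already exists is updated in place rather than duplicated. A toy illustration of that semantics using a dict keyed on the natural key (not the service's SQL, which does this via a select-then-update):

```python
def upsert_consumption(store: dict, record: dict) -> None:
    """Insert-or-update keyed on (cups, datetime): same key
    overwrites the stored record, new key inserts a fresh one."""
    store[(record["cups"], record["datetime"])] = record

store = {}
upsert_consumption(
    store, {"cups": "CUPS1", "datetime": "2024-06-15T12:00", "value_kwh": 0.5}
)
upsert_consumption(
    store, {"cups": "CUPS1", "datetime": "2024-06-15T12:00", "value_kwh": 1.5}
)
```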
| @pytest.mark.asyncio | ||
| async def test_save_from_pydantic_models(self, db_service): | ||
| """Test saving data from Pydantic models.""" | ||
| cups = "ES1234567890123456789" | ||
| # Create Pydantic models | ||
| supply = SupplyModel( | ||
| cups=cups, | ||
| date_start=datetime(2024, 1, 1), | ||
| date_end=datetime(2024, 12, 31), | ||
| address="Test Address", | ||
| postal_code="12345", | ||
| province="Test Province", | ||
| municipality="Test Municipality", | ||
| distributor="Test Distributor", | ||
| point_type=5, | ||
| distributor_code="123", | ||
| ) | ||
| contract = ContractModel( | ||
| cups=cups, | ||
| date_start=datetime(2024, 1, 1), | ||
| date_end=datetime(2024, 12, 31), | ||
| marketer="Test Marketer", | ||
| distributor_code="123", | ||
| power_p1=4.4, | ||
| power_p2=4.4, | ||
| ) | ||
| consumption = ConsumptionModel( | ||
| cups=cups, datetime=datetime(2024, 6, 15, 12, 0), delta_h=1.0, value_kwh=0.5 | ||
| ) | ||
| maxpower = MaxPowerModel( | ||
| cups=cups, datetime=datetime(2024, 6, 15, 15, 30), value_kw=3.2 | ||
| ) | ||
| # Save using the batch method | ||
| await db_service.save_from_pydantic_models( | ||
| cups=cups, | ||
| supplies=[supply], | ||
| contracts=[contract], | ||
| consumptions=[consumption], | ||
| maximeter=[maxpower], | ||
| ) | ||
| # Verify data was saved | ||
| saved_supply = await db_service.get_supply(cups) | ||
| assert saved_supply is not None | ||
| assert saved_supply.cups == cups | ||
| saved_contracts = await db_service.get_contracts(cups) | ||
| assert len(saved_contracts) == 1 | ||
| assert saved_contracts[0].marketer == "Test Marketer" | ||
| saved_consumptions = await db_service.get_consumptions(cups) | ||
| assert len(saved_consumptions) == 1 | ||
| assert saved_consumptions[0].value_kwh == 0.5 | ||
| saved_maxpower = await db_service.get_maxpower_readings(cups) | ||
| assert len(saved_maxpower) == 1 | ||
| assert saved_maxpower[0].value_kw == 3.2 | ||
| @pytest.mark.asyncio | ||
| async def test_database_relationships( | ||
| self, db_service, sample_supply_data, sample_contract_data | ||
| ): | ||
| """Test that database relationships work correctly.""" | ||
| # Save supply and contract | ||
| await db_service.save_supply(sample_supply_data) | ||
| await db_service.save_contract(sample_contract_data) | ||
| # Retrieve the supply (relationships would only load if eagerly requested) | ||
| supply = await db_service.get_supply(sample_supply_data["cups"]) | ||
| assert supply is not None | ||
| assert supply.cups == sample_supply_data["cups"] | ||
| # Verify foreign key constraint works | ||
| contracts = await db_service.get_contracts(sample_supply_data["cups"]) | ||
| assert len(contracts) == 1 | ||
| assert contracts[0].cups == sample_supply_data["cups"] | ||
| @pytest.mark.asyncio | ||
| async def test_invalid_cups_foreign_key(self, db_service, sample_contract_data): | ||
| """Test that foreign key constraint prevents orphaned records.""" | ||
| # Try to save contract without supply (should fail or handle gracefully) | ||
| # Note: This depends on SQLite foreign key enforcement | ||
| try: | ||
| await db_service.save_contract(sample_contract_data) | ||
| # If it doesn't raise an error, verify the record wasn't actually saved | ||
| # or that the database handles it appropriately | ||
| except Exception: | ||
| # Expected if foreign key constraints are enforced | ||
| pass | ||
| @pytest.mark.asyncio | ||
| async def test_default_storage_dir(self): | ||
| """Test that default storage directory is used when none provided.""" | ||
| import tempfile | ||
| test_dir = tempfile.mkdtemp() | ||
| try: | ||
| # Reset the global singleton to allow testing default directory | ||
| import edata.services.database | ||
| edata.services.database._db_service = None | ||
| with patch("edata.services.database.DEFAULT_STORAGE_DIR", test_dir): | ||
| service = await get_database_service() | ||
| # Check that service was created with the correct directory | ||
| assert service._db_dir == test_dir | ||
| assert os.path.exists(service._db_dir) | ||
| finally: | ||
| # Clean up | ||
| if os.path.exists(test_dir): | ||
| shutil.rmtree(test_dir) |
| """Tests for MaximeterService.""" | ||
| from datetime import datetime | ||
| from unittest.mock import AsyncMock, Mock, patch | ||
| import pytest | ||
| import pytest_asyncio | ||
| from edata.models.database import MaxPowerModel | ||
| from edata.services.maximeter import MaximeterService | ||
| @pytest_asyncio.fixture | ||
| async def maximeter_service(): | ||
| """Create a maximeter service with mocked dependencies.""" | ||
| with patch("edata.services.maximeter.get_database_service") as mock_db_factory: | ||
| mock_db = Mock() | ||
| # Make the database methods return AsyncMock objects | ||
| mock_db.get_maxpower = AsyncMock(return_value=[]) | ||
| mock_db.save_maxpower = AsyncMock(return_value=Mock()) | ||
| mock_db_factory.return_value = mock_db | ||
| # Create a mock DatadisConnector | ||
| mock_datadis_connector = Mock() | ||
| service = MaximeterService(datadis_connector=mock_datadis_connector) | ||
| service._db_service = mock_db | ||
| return service | ||
| @pytest.fixture | ||
| def sample_maximeter(): | ||
| """Sample maximeter data for testing.""" | ||
| return [ | ||
| MaxPowerModel( | ||
| cups="ES001234567890123456AB", | ||
| datetime=datetime(2023, 1, 15, 14, 30), | ||
| value_kw=2.5, | ||
| ), | ||
| MaxPowerModel( | ||
| cups="ES001234567890123456AB", | ||
| datetime=datetime(2023, 2, 20, 16, 45), | ||
| value_kw=3.2, | ||
| ), | ||
| MaxPowerModel( | ||
| cups="ES001234567890123456AB", | ||
| datetime=datetime(2023, 3, 10, 12, 15), | ||
| value_kw=1.8, | ||
| ), | ||
| ] | ||
| class TestMaximeterService: | ||
| """Test class for MaximeterService.""" | ||
| @pytest.mark.asyncio | ||
| async def test_get_maximeter_summary(self, maximeter_service, sample_maximeter): | ||
| """Test getting maximeter summary.""" | ||
| # Setup mocks | ||
| maximeter_service.get_stored_maxpower = AsyncMock(return_value=sample_maximeter) | ||
| # Execute | ||
| result = await maximeter_service.get_maximeter_summary("ES001234567890123456AB") | ||
| # Verify | ||
| assert result["max_power_kW"] == 3.2 # max value | ||
| assert result["max_power_date"] == datetime(2023, 2, 20, 16, 45) | ||
| assert result["max_power_mean_kW"] == 2.5 # (2.5 + 3.2 + 1.8) / 3 | ||
| assert result["max_power_90perc_kW"] == 3.2 # 90th percentile | ||
| @pytest.mark.asyncio | ||
| async def test_get_maximeter_summary_no_data(self, maximeter_service): | ||
| """Test getting maximeter summary with no data.""" | ||
| # Setup mocks | ||
| maximeter_service.get_stored_maxpower = AsyncMock(return_value=[]) | ||
| # Execute | ||
| result = await maximeter_service.get_maximeter_summary("ES001234567890123456AB") | ||
| # Verify | ||
| assert result["max_power_kW"] is None | ||
| assert result["max_power_date"] is None | ||
| assert result["max_power_mean_kW"] is None | ||
| assert result["max_power_90perc_kW"] is None |
| """Tests for SupplyService.""" | ||
| from datetime import datetime | ||
| from unittest.mock import AsyncMock, Mock, patch | ||
| import pytest | ||
| import pytest_asyncio | ||
| from edata.models.database import SupplyModel | ||
| from edata.services.supply import SupplyService | ||
| @pytest_asyncio.fixture | ||
| async def supply_service(): | ||
| """Create a supply service with mocked dependencies.""" | ||
| with patch("edata.services.supply.get_database_service") as mock_db_factory: | ||
| mock_db = Mock() | ||
| # Make the database methods return AsyncMock objects | ||
| mock_db.get_supplies = AsyncMock(return_value=[]) | ||
| mock_db.save_supply = AsyncMock(return_value=Mock()) | ||
| mock_db_factory.return_value = mock_db | ||
| # Create a mock DatadisConnector | ||
| mock_datadis_connector = Mock() | ||
| service = SupplyService(datadis_connector=mock_datadis_connector) | ||
| service._db_service = mock_db | ||
| return service | ||
| @pytest.fixture | ||
| def sample_supplies(): | ||
| """Sample supply data for testing.""" | ||
| return [ | ||
| SupplyModel( | ||
| cups="ES001234567890123456AB", | ||
| distributor_code="123", | ||
| point_type=5, | ||
| date_start=datetime(2023, 1, 1), | ||
| date_end=datetime(2024, 12, 31), | ||
| address="Test Address 1", | ||
| postal_code="12345", | ||
| province="Test Province 1", | ||
| municipality="Test Municipality 1", | ||
| distributor="Test Distributor 1", | ||
| ), | ||
| SupplyModel( | ||
| cups="ES987654321098765432BA", | ||
| distributor_code="456", | ||
| point_type=4, | ||
| date_start=datetime(2023, 6, 1), | ||
| date_end=datetime(2025, 6, 1), | ||
| address="Test Address 2", | ||
| postal_code="67890", | ||
| province="Test Province 2", | ||
| municipality="Test Municipality 2", | ||
| distributor="Test Distributor 2", | ||
| ), | ||
| ] | ||
| class TestSupplyService: | ||
| """Test class for SupplyService.""" | ||
| @pytest.mark.asyncio | ||
| async def test_update_supplies_success(self, supply_service): | ||
| """Test successful supply update.""" | ||
| # Setup mocks - now returns Pydantic models | ||
| supply_service._datadis.get_supplies = AsyncMock( | ||
| return_value=[ | ||
| SupplyModel( | ||
| cups="ES001234567890123456AB", | ||
| distributor_code="123", | ||
| point_type=5, | ||
| date_start=datetime(2023, 1, 1), | ||
| date_end=datetime(2024, 12, 31), | ||
| address="Test Address", | ||
| postal_code="12345", | ||
| province="Test Province", | ||
| municipality="Test Municipality", | ||
| distributor="Test Distributor", | ||
| ) | ||
| ] | ||
| ) | ||
| supply_service._db_service.get_supplies.side_effect = [ | ||
| [], | ||
| [Mock()], | ||
| ] # No existing, then 1 stored | ||
| supply_service._db_service.save_supply.return_value = Mock() | ||
| # Execute | ||
| result = await supply_service.update_supplies() | ||
| # Verify | ||
| assert result["success"] is True | ||
| assert result["stats"]["fetched"] == 1 | ||
| assert result["stats"]["saved"] == 1 | ||
| assert result["stats"]["updated"] == 0 | ||
| supply_service._datadis.get_supplies.assert_called_once() | ||
| supply_service._db_service.save_supply.assert_called_once() | ||
| @pytest.mark.asyncio | ||
| async def test_get_supplies(self, supply_service, sample_supplies): | ||
| """Test getting supplies.""" | ||
| # Setup mocks | ||
| supply_service._db_service.get_supplies.return_value = sample_supplies | ||
| # Execute | ||
| result = await supply_service.get_supplies() | ||
| # Verify | ||
| assert len(result) == 2 | ||
| assert result[0].cups == "ES001234567890123456AB" | ||
| assert result[1].cups == "ES987654321098765432BA" | ||
| supply_service._db_service.get_supplies.assert_called_once_with(cups=None) | ||
| @pytest.mark.asyncio | ||
| async def test_get_supply_by_cups(self, supply_service, sample_supplies): | ||
| """Test getting supply by CUPS.""" | ||
| # Setup mocks | ||
| supply_service._db_service.get_supplies.return_value = [sample_supplies[0]] | ||
| # Execute | ||
| result = await supply_service.get_supply_by_cups("ES001234567890123456AB") | ||
| # Verify | ||
| assert result is not None | ||
| assert result.cups == "ES001234567890123456AB" | ||
| assert result.distributor == "Test Distributor 1" | ||
| supply_service._db_service.get_supplies.assert_called_once_with( | ||
| cups="ES001234567890123456AB" | ||
| ) | ||
| @pytest.mark.asyncio | ||
| async def test_get_cups_list(self, supply_service, sample_supplies): | ||
| """Test getting CUPS list.""" | ||
| # Setup mocks | ||
| supply_service._db_service.get_supplies.return_value = sample_supplies | ||
| # Execute | ||
| result = await supply_service.get_cups_list() | ||
| # Verify | ||
| assert len(result) == 2 | ||
| assert "ES001234567890123456AB" in result | ||
| assert "ES987654321098765432BA" in result | ||
| @pytest.mark.asyncio | ||
| async def test_get_active_supplies(self, supply_service, sample_supplies): | ||
| """Test getting active supplies.""" | ||
| # Setup mocks | ||
| supply_service._db_service.get_supplies.return_value = sample_supplies | ||
| # Execute - test with date in 2024 (both should be active) | ||
| result = await supply_service.get_active_supplies(datetime(2024, 6, 15)) | ||
| # Verify | ||
| assert len(result) == 2 # Both supplies should be active in 2024 | ||
| for supply in result: | ||
| assert supply.date_start <= datetime(2024, 6, 15) <= supply.date_end | ||
| @pytest.mark.asyncio | ||
| async def test_get_supply_stats(self, supply_service, sample_supplies): | ||
| """Test getting supply statistics.""" | ||
| # Setup mocks | ||
| supply_service._db_service.get_supplies.return_value = sample_supplies | ||
| # Execute | ||
| result = await supply_service.get_supply_stats() | ||
| # Verify | ||
| assert result["total_supplies"] == 2 | ||
| assert result["total_cups"] == 2 | ||
| assert result["date_range"]["earliest_start"] == datetime(2023, 1, 1) | ||
| assert result["date_range"]["latest_end"] == datetime(2025, 6, 1) | ||
| assert result["point_types"] == {5: 1, 4: 1} | ||
| assert result["distributors"] == { | ||
| "Test Distributor 1": 1, | ||
| "Test Distributor 2": 1, | ||
| } | ||
| @pytest.mark.asyncio | ||
| async def test_validate_cups(self, supply_service, sample_supplies): | ||
| """Test CUPS validation.""" | ||
| # Setup mocks | ||
| supply_service._db_service.get_supplies.return_value = [sample_supplies[0]] | ||
| # Execute | ||
| result = await supply_service.validate_cups("ES001234567890123456AB") | ||
| # Verify | ||
| assert result is True | ||
| # Test invalid CUPS | ||
| supply_service._db_service.get_supplies.return_value = [] | ||
| result = await supply_service.validate_cups("INVALID_CUPS") | ||
| assert result is False | ||
| @pytest.mark.asyncio | ||
| async def test_get_distributor_code(self, supply_service, sample_supplies): | ||
| """Test getting distributor code.""" | ||
| # Setup mocks | ||
| supply_service._db_service.get_supplies.return_value = [sample_supplies[0]] | ||
| # Execute | ||
| result = await supply_service.get_distributor_code("ES001234567890123456AB") | ||
| # Verify | ||
| assert result == "123" | ||
| @pytest.mark.asyncio | ||
| async def test_get_point_type(self, supply_service, sample_supplies): | ||
| """Test getting point type.""" | ||
| # Setup mocks | ||
| supply_service._db_service.get_supplies.return_value = [sample_supplies[0]] | ||
| # Execute | ||
| result = await supply_service.get_point_type("ES001234567890123456AB") | ||
| # Verify | ||
| assert result == 5 | ||
| @pytest.mark.asyncio | ||
| async def test_update_supplies_no_data(self, supply_service): | ||
| """Test supply update with no data returned.""" | ||
| # Setup mocks | ||
| supply_service._datadis.get_supplies = AsyncMock(return_value=[]) | ||
| # Execute | ||
| result = await supply_service.update_supplies() | ||
| # Verify | ||
| assert result["success"] is True | ||
| assert result["stats"]["fetched"] == 0 | ||
| assert result["stats"]["saved"] == 0 | ||
| @pytest.mark.asyncio | ||
| async def test_update_supplies_error(self, supply_service): | ||
| """Test supply update with error.""" | ||
| # Setup mocks | ||
| supply_service._datadis.get_supplies = AsyncMock( | ||
| side_effect=Exception("API Error") | ||
| ) | ||
| # Execute | ||
| result = await supply_service.update_supplies() | ||
| # Verify | ||
| assert result["success"] is False | ||
| assert "error" in result | ||
| assert result["error"] == "API Error" |
| """Utility functions for edata package.""" | ||
| import contextlib | ||
| import json | ||
| import logging | ||
| import math | ||
| from copy import deepcopy | ||
| from datetime import date, datetime, timedelta | ||
| from json import JSONEncoder | ||
| from typing import Any, Dict, List, Optional | ||
| import holidays | ||
| _LOGGER = logging.getLogger(__name__) | ||
| # PVPC tariff constants | ||
| HOURS_P1 = [10, 11, 12, 13, 18, 19, 20, 21] | ||
| HOURS_P2 = [8, 9, 14, 15, 16, 17, 22, 23] | ||
| WEEKDAYS_P3 = [5, 6] | ||
| def get_pvpc_tariff(a_datetime: datetime) -> str: | ||
| """Evaluate the PVPC tariff for a given datetime. | ||
| Args: | ||
| a_datetime: The datetime to evaluate | ||
| Returns: | ||
| The tariff period: "p1", "p2", or "p3" | ||
| """ | ||
| hdays = holidays.country_holidays("ES") | ||
| hour = a_datetime.hour | ||
| weekday = a_datetime.weekday() | ||
| if weekday in WEEKDAYS_P3 or a_datetime.date() in hdays: | ||
| return "p3" | ||
| elif hour in HOURS_P1: | ||
| return "p1" | ||
| elif hour in HOURS_P2: | ||
| return "p2" | ||
| else: | ||
| return "p3" | ||
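As a minimal, self-contained sketch of the period rule that `get_pvpc_tariff` encodes (omitting the Spanish-holiday lookup, which needs the third-party `holidays` package; `pvpc_period` is a hypothetical standalone name, not part of the library):

```python
from datetime import datetime

# PVPC 2.0TD periods: weekends (and holidays, omitted here) are p3;
# on working days the hour selects p1 (peak), p2 (shoulder) or p3 (valley).
HOURS_P1 = [10, 11, 12, 13, 18, 19, 20, 21]
HOURS_P2 = [8, 9, 14, 15, 16, 17, 22, 23]

def pvpc_period(dt: datetime) -> str:
    if dt.weekday() >= 5:  # Saturday or Sunday
        return "p3"
    if dt.hour in HOURS_P1:
        return "p1"
    if dt.hour in HOURS_P2:
        return "p2"
    return "p3"  # remaining (night) hours

# A Wednesday at 11:00 falls in the peak period.
assert pvpc_period(datetime(2024, 6, 12, 11)) == "p1"
```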
| def extend_by_key( | ||
| old_lst: List[Dict[str, Any]], new_lst: List[Dict[str, Any]], key: str | ||
| ) -> List[Dict[str, Any]]: | ||
| """Extend a list of dicts by key.""" | ||
| lst = deepcopy(old_lst) | ||
| temp_list = [] | ||
| for new_element in new_lst: | ||
| for old_element in lst: | ||
| if new_element[key] == old_element[key]: | ||
| for i in old_element: | ||
| old_element[i] = new_element[i] | ||
| break | ||
| else: | ||
| temp_list.append(new_element) | ||
| lst.extend(temp_list) | ||
| return lst | ||
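To illustrate the merge semantics above, here is a self-contained restatement of `extend_by_key` with sample data: entries whose key matches an existing element overwrite its fields in place, while unmatched entries are appended. The sample dicts are illustrative only.

```python
from copy import deepcopy

def extend_by_key(old_lst, new_lst, key):
    """Update matching entries field by field, append the rest."""
    lst = deepcopy(old_lst)
    pending = []
    for new_element in new_lst:
        for old_element in lst:
            if new_element[key] == old_element[key]:
                # Copy each existing field from the newer element.
                for field in list(old_element):
                    old_element[field] = new_element[field]
                break
        else:
            pending.append(new_element)
    lst.extend(pending)
    return lst

old = [{"datetime": "t1", "value_kwh": 1.0}]
new = [{"datetime": "t1", "value_kwh": 1.5},
       {"datetime": "t2", "value_kwh": 2.0}]
merged = extend_by_key(old, new, "datetime")
assert merged == [{"datetime": "t1", "value_kwh": 1.5},
                  {"datetime": "t2", "value_kwh": 2.0}]
```

Note that the inner `for`/`else` relies on Python's loop-`else` clause: the `else` branch runs only when no matching element broke out of the loop.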
| def extract_dt_ranges( | ||
| lst: List[Dict[str, Any]], | ||
| dt_from: datetime, | ||
| dt_to: datetime, | ||
| gap_interval: timedelta = timedelta(hours=1), | ||
| ) -> tuple: | ||
| """Filter a list of dicts between two datetimes.""" | ||
| new_lst = [] | ||
| missing = [] | ||
| oldest_dt = None | ||
| newest_dt = None | ||
| last_dt = None | ||
| if len(lst) > 0: | ||
| sorted_lst = sorted(lst, key=lambda i: i["datetime"]) | ||
| last_dt = dt_from | ||
| for i in sorted_lst: | ||
| if dt_from <= i["datetime"] <= dt_to: | ||
| if (i["datetime"] - last_dt) > gap_interval: | ||
| missing.append({"from": last_dt, "to": i["datetime"]}) | ||
| if i.get("value_kWh", 1) > 0: | ||
| if oldest_dt is None or i["datetime"] < oldest_dt: | ||
| oldest_dt = i["datetime"] | ||
| if newest_dt is None or i["datetime"] > newest_dt: | ||
| newest_dt = i["datetime"] | ||
| if i["datetime"] != last_dt: # remove duplicates | ||
| new_lst.append(i) | ||
| last_dt = i["datetime"] | ||
| if dt_to > last_dt: | ||
| missing.append({"from": last_dt, "to": dt_to}) | ||
| _LOGGER.debug("found data from %s to %s", oldest_dt, newest_dt) | ||
| else: | ||
| missing.append({"from": dt_from, "to": dt_to}) | ||
| return new_lst, missing | ||
| def get_by_key( | ||
| lst: List[Dict[str, Any]], key: str, value: Any | ||
| ) -> Optional[Dict[str, Any]]: | ||
| """Obtain an element of a list of dicts by key=value.""" | ||
| for i in lst: | ||
| if i[key] == value: | ||
| return i | ||
| return None | ||
| def serialize_dict(data: dict) -> dict: | ||
| """Serialize dicts as json.""" | ||
| class DateTimeEncoder(JSONEncoder): | ||
| """Replace datetime objects with ISO strings.""" | ||
| def default(self, o): | ||
| if isinstance(o, (date, datetime)): | ||
| return o.isoformat() | ||
| return super().default(o)  # raise TypeError for unsupported types | ||
| return json.loads(json.dumps(data, cls=DateTimeEncoder)) | ||
| def deserialize_dict(serialized_dict: dict) -> dict: | ||
| """Deserializes a json replacing ISOTIME strings into datetime.""" | ||
| def datetime_parser(json_dict): | ||
| """Parse JSON while converting ISO strings into datetime objects.""" | ||
| for key, value in json_dict.items(): | ||
| if "date" in key: | ||
| with contextlib.suppress(Exception): | ||
| json_dict[key] = datetime.fromisoformat(value) | ||
| return json_dict | ||
| return json.loads(json.dumps(serialized_dict), object_hook=datetime_parser) | ||
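The two helpers above form a round trip: datetimes go out as ISO-8601 strings and come back for any key containing `"date"`. A compact, self-contained sketch of that round trip (the `roundtrip` helper is hypothetical, combining both functions for illustration):

```python
import json
from datetime import datetime
from json import JSONEncoder

class DateTimeEncoder(JSONEncoder):
    """Emit datetime objects as ISO-8601 strings."""
    def default(self, o):
        if isinstance(o, datetime):
            return o.isoformat()
        return super().default(o)

def roundtrip(data: dict) -> dict:
    """Serialize, then deserialize restoring keys that contain 'date'."""
    serialized = json.loads(json.dumps(data, cls=DateTimeEncoder))
    def parser(d):
        for key, value in d.items():
            if "date" in key and isinstance(value, str):
                try:
                    d[key] = datetime.fromisoformat(value)
                except ValueError:
                    pass
        return d
    return json.loads(json.dumps(serialized), object_hook=parser)

original = {"datetime": datetime(2024, 6, 15, 12, 0), "value_kwh": 0.5}
assert roundtrip(original) == original
```

The key-name heuristic means only fields whose name contains `"date"` (e.g. `datetime`, `date_start`) are restored; other ISO strings pass through untouched.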
| def percentile(N: List, percent: float, key=lambda x: x): | ||
| """Find the percentile of a list of values.""" | ||
| if not N: | ||
| return None | ||
| k = (len(N) - 1) * percent | ||
| f = math.floor(k) | ||
| c = math.ceil(k) | ||
| if f == c: | ||
| return key(N[int(k)]) | ||
| d0 = key(N[int(f)]) * (c - k) | ||
| d1 = key(N[int(c)]) * (k - f) | ||
| return d0 + d1 | ||
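The percentile helper uses linear interpolation between the two nearest ranks; a self-contained restatement with a worked check (note the input must already be sorted, since the helper does not sort it):

```python
import math

def percentile(values, percent, key=lambda x: x):
    """Linearly interpolated percentile of a pre-sorted list."""
    if not values:
        return None
    k = (len(values) - 1) * percent
    f, c = math.floor(k), math.ceil(k)
    if f == c:  # k lands exactly on an element
        return key(values[int(k)])
    # Weight the two neighbouring elements by their distance to k.
    return key(values[f]) * (c - k) + key(values[c]) * (k - f)

# With four points, the 50th percentile interpolates rank 1.5:
# 2 * 0.5 + 3 * 0.5 = 2.5
assert percentile([1, 2, 3, 4], 0.5) == 2.5
```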
| def extend_and_filter( | ||
| old_lst: List[Dict[str, Any]], | ||
| new_lst: List[Dict[str, Any]], | ||
| key: str, | ||
| dt_from: datetime, | ||
| dt_to: datetime, | ||
| ) -> List[Dict[str, Any]]: | ||
| """Extend and filter data by datetime range.""" | ||
| data = extend_by_key(old_lst, new_lst, key) | ||
| data, _ = extract_dt_ranges( | ||
| data, | ||
| dt_from, | ||
| dt_to, | ||
| gap_interval=timedelta(days=365),  # large interval so no gaps are reported | ||
| ) | ||
| return data |
| .PHONY: help install install-dev test lint format build clean publish publish-test | ||
| help: ## Show this help message | ||
| @grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-20s\033[0m %s\n", $$1, $$2}' | ||
| install: ## Install the package in development mode | ||
| pip install -e . | ||
| install-dev: ## Install with development dependencies | ||
| pip install -e ".[dev,test]" | ||
| test: ## Run tests | ||
| pytest | ||
| test-cov: ## Run tests with coverage | ||
| pytest --cov=edata --cov-report=html --cov-report=term | ||
| lint: ## Run linting checks | ||
| flake8 edata/ | ||
| mypy edata/ | ||
| format: ## Format code with black | ||
| black edata/ | ||
| format-check: ## Check if code is formatted | ||
| black --check edata/ | ||
| clean: ## Clean build artifacts | ||
| rm -rf build/ | ||
| rm -rf dist/ | ||
| rm -rf *.egg-info/ | ||
| find . -type d -name __pycache__ -exec rm -rf {} + | ||
| find . -type f -name "*.pyc" -delete | ||
| build: ## Build the package | ||
| python -m build | ||
| publish-test: ## Publish to TestPyPI | ||
| python -m twine upload --repository testpypi dist/* | ||
| publish: ## Publish to PyPI | ||
| python -m twine upload dist/* | ||
| # Combined workflow commands | ||
| dev-setup: install-dev ## Setup development environment | ||
| pre-commit: format lint test ## Run all pre-commit checks | ||
| release: clean build publish ## Build and publish to PyPI | ||
| release-test: clean build publish-test ## Build and publish to TestPyPI |