logilab_common-2.1.0/COPYING

GNU GENERAL PUBLIC LICENSE
Version 2, June 1991

Copyright (C) 1989, 1991 Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed.

Preamble

The licenses for most software are designed to take away your freedom to share and change it. By contrast, the GNU General Public License is intended to guarantee your freedom to share and change free software--to make sure the software is free for all its users. This General Public License applies to most of the Free Software Foundation's software and to any other program whose authors commit to using it. (Some other Free Software Foundation software is covered by the GNU Lesser General Public License instead.) You can apply it to your programs, too.

When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for this service if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs; and that you know you can do these things.

To protect your rights, we need to make restrictions that forbid anyone to deny you these rights or to ask you to surrender the rights. These restrictions translate to certain responsibilities for you if you distribute copies of the software, or if you modify it.

For example, if you distribute copies of such a program, whether gratis or for a fee, you must give the recipients all the rights that you have. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights.

We protect your rights with two steps: (1) copyright the software, and (2) offer you this license which gives you legal permission to copy, distribute and/or modify the software.

Also, for each author's protection and ours, we want to make certain that everyone understands that there is no warranty for this free software. If the software is modified by someone else and passed on, we want its recipients to know that what they have is not the original, so that any problems introduced by others will not reflect on the original authors' reputations.

Finally, any free program is threatened constantly by software patents. We wish to avoid the danger that redistributors of a free program will individually obtain patent licenses, in effect making the program proprietary. To prevent this, we have made it clear that any patent must be licensed for everyone's free use or not licensed at all.

The precise terms and conditions for copying, distribution and modification follow.

GNU GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION

0. This License applies to any program or other work which contains a notice placed by the copyright holder saying it may be distributed under the terms of this General Public License.
The "Program", below, refers to any such program or work, and a "work based on the Program" means either the Program or any derivative work under copyright law: that is to say, a work containing the Program or a portion of it, either verbatim or with modifications and/or translated into another language. (Hereinafter, translation is included without limitation in the term "modification".) Each licensee is addressed as "you". Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running the Program is not restricted, and the output from the Program is covered only if its contents constitute a work based on the Program (independent of having been made by running the Program). Whether that is true depends on what the Program does. 1. You may copy and distribute verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice and disclaimer of warranty; keep intact all the notices that refer to this License and to the absence of any warranty; and give any other recipients of the Program a copy of this License along with the Program. You may charge a fee for the physical act of transferring a copy, and you may at your option offer warranty protection in exchange for a fee. 2. You may modify your copy or copies of the Program or any portion of it, thus forming a work based on the Program, and copy and distribute such modifications or work under the terms of Section 1 above, provided that you also meet all of these conditions: a) You must cause the modified files to carry prominent notices stating that you changed the files and the date of any change. b) You must cause any work that you distribute or publish, that in whole or in part contains or is derived from the Program or any part thereof, to be licensed as a whole at no charge to all third parties under the terms of this License. c) If the modified program normally reads commands interactively when run, you must cause it, when started running for such interactive use in the most ordinary way, to print or display an announcement including an appropriate copyright notice and a notice that there is no warranty (or else, saying that you provide a warranty) and that users may redistribute the program under these conditions, and telling the user how to view a copy of this License. (Exception: if the Program itself is interactive but does not normally print such an announcement, your work based on the Program is not required to print an announcement.) These requirements apply to the modified work as a whole. If identifiable sections of that work are not derived from the Program, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works. But when you distribute the same sections as part of a whole which is a work based on the Program, the distribution of the whole must be on the terms of this License, whose permissions for other licensees extend to the entire whole, and thus to each and every part regardless of who wrote it. Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by you; rather, the intent is to exercise the right to control the distribution of derivative or collective works based on the Program. 
In addition, mere aggregation of another work not based on the Program with the Program (or with a work based on the Program) on a volume of a storage or distribution medium does not bring the other work under the scope of this License. 3. You may copy and distribute the Program (or a work based on it, under Section 2) in object code or executable form under the terms of Sections 1 and 2 above provided that you also do one of the following: a) Accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or, b) Accompany it with a written offer, valid for at least three years, to give any third party, for a charge no more than your cost of physically performing source distribution, a complete machine-readable copy of the corresponding source code, to be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or, c) Accompany it with the information you received as to the offer to distribute corresponding source code. (This alternative is allowed only for noncommercial distribution and only if you received the program in object code or executable form with such an offer, in accord with Subsection b above.) The source code for a work means the preferred form of the work for making modifications to it. For an executable work, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the executable. However, as a special exception, the source code distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable. If distribution of executable or object code is made by offering access to copy from a designated place, then offering equivalent access to copy the source code from the same place counts as distribution of the source code, even though third parties are not compelled to copy the source along with the object code. 4. You may not copy, modify, sublicense, or distribute the Program except as expressly provided under this License. Any attempt otherwise to copy, modify, sublicense or distribute the Program is void, and will automatically terminate your rights under this License. However, parties who have received copies, or rights, from you under this License will not have their licenses terminated so long as such parties remain in full compliance. 5. You are not required to accept this License, since you have not signed it. However, nothing else grants you permission to modify or distribute the Program or its derivative works. These actions are prohibited by law if you do not accept this License. Therefore, by modifying or distributing the Program (or any work based on the Program), you indicate your acceptance of this License to do so, and all its terms and conditions for copying, distributing or modifying the Program or works based on it. 6. Each time you redistribute the Program (or any work based on the Program), the recipient automatically receives a license from the original licensor to copy, distribute or modify the Program subject to these terms and conditions. You may not impose any further restrictions on the recipients' exercise of the rights granted herein. 
You are not responsible for enforcing compliance by third parties to this License. 7. If, as a consequence of a court judgment or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Program at all. For example, if a patent license would not permit royalty-free redistribution of the Program by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Program. If any portion of this section is held invalid or unenforceable under any particular circumstance, the balance of the section is intended to apply and the section as a whole is intended to apply in other circumstances. It is not the purpose of this section to induce you to infringe any patents or other property right claims or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of the free software distribution system, which is implemented by public license practices. Many people have made generous contributions to the wide range of software distributed through that system in reliance on consistent application of that system; it is up to the author/donor to decide if he or she is willing to distribute software through any other system and a licensee cannot impose that choice. This section is intended to make thoroughly clear what is believed to be a consequence of the rest of this License. 8. If the distribution and/or use of the Program is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the Program under this License may add an explicit geographical distribution limitation excluding those countries, so that distribution is permitted only in or among countries not thus excluded. In such case, this License incorporates the limitation as if written in the body of this License. 9. The Free Software Foundation may publish revised and/or new versions of the General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Program specifies a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of this License, you may choose any version ever published by the Free Software Foundation. 10. If you wish to incorporate parts of the Program into other free programs whose distribution conditions are different, write to the author to ask for permission. For software which is copyrighted by the Free Software Foundation, write to the Free Software Foundation; we sometimes make exceptions for this. Our decision will be guided by the two goals of preserving the free status of all derivatives of our free software and of promoting the sharing and reuse of software generally. NO WARRANTY 11. 
BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. END OF TERMS AND CONDITIONS How to Apply These Terms to Your New Programs If you develop a new program, and you want it to be of the greatest possible use to the public, the best way to achieve this is to make it free software which everyone can redistribute and change under these terms. To do so, attach the following notices to the program. It is safest to attach them to the start of each source file to most effectively convey the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found. Copyright (C) This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. Also add information on how to contact you by electronic and paper mail. If the program is interactive, make it output a short notice like this when it starts in an interactive mode: Gnomovision version 69, Copyright (C) year name of author Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'. This is free software, and you are welcome to redistribute it under certain conditions; type `show c' for details. The hypothetical commands `show w' and `show c' should show the appropriate parts of the General Public License. Of course, the commands you use may be called something other than `show w' and `show c'; they could even be mouse-clicks or menu items--whatever suits your program. You should also get your employer (if you work as a programmer) or your school, if any, to sign a "copyright disclaimer" for the program, if necessary. Here is a sample; alter the names: Yoyodyne, Inc., hereby disclaims all copyright interest in the program `Gnomovision' (which makes passes at compilers) written by James Hacker. 
<signature of Ty Coon>, 1 April 1989
Ty Coon, President of Vice

This General Public License does not permit incorporating your program into proprietary programs. If your program is a subroutine library, you may consider it more useful to permit linking proprietary applications with the library. If this is what you want to do, use the GNU Lesser General Public License instead of this License.

logilab_common-2.1.0/COPYING.LESSER

GNU LESSER GENERAL PUBLIC LICENSE
Version 2.1, February 1999

Copyright (C) 1991, 1999 Free Software Foundation, Inc.
51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA

Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed.

[This is the first released version of the Lesser GPL. It also counts as the successor of the GNU Library Public License, version 2, hence the version number 2.1.]

Preamble

The licenses for most software are designed to take away your freedom to share and change it. By contrast, the GNU General Public Licenses are intended to guarantee your freedom to share and change free software--to make sure the software is free for all its users.

This license, the Lesser General Public License, applies to some specially designated software packages--typically libraries--of the Free Software Foundation and other authors who decide to use it. You can use it too, but we suggest you first think carefully about whether this license or the ordinary General Public License is the better strategy to use in any particular case, based on the explanations below.

When we speak of free software, we are referring to freedom of use, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for this service if you wish); that you receive source code or can get it if you want it; that you can change the software and use pieces of it in new free programs; and that you are informed that you can do these things.

To protect your rights, we need to make restrictions that forbid distributors to deny you these rights or to ask you to surrender these rights. These restrictions translate to certain responsibilities for you if you distribute copies of the library or if you modify it.

For example, if you distribute copies of the library, whether gratis or for a fee, you must give the recipients all the rights that we gave you. You must make sure that they, too, receive or can get the source code. If you link other code with the library, you must provide complete object files to the recipients, so that they can relink them with the library after making changes to the library and recompiling it. And you must show them these terms so they know their rights.

We protect your rights with a two-step method: (1) we copyright the library, and (2) we offer you this license, which gives you legal permission to copy, distribute and/or modify the library.

To protect each distributor, we want to make it very clear that there is no warranty for the free library. Also, if the library is modified by someone else and passed on, the recipients should know that what they have is not the original version, so that the original author's reputation will not be affected by problems that might be introduced by others.

Finally, software patents pose a constant threat to the existence of any free program.
We wish to make sure that a company cannot effectively restrict the users of a free program by obtaining a restrictive license from a patent holder. Therefore, we insist that any patent license obtained for a version of the library must be consistent with the full freedom of use specified in this license. Most GNU software, including some libraries, is covered by the ordinary GNU General Public License. This license, the GNU Lesser General Public License, applies to certain designated libraries, and is quite different from the ordinary General Public License. We use this license for certain libraries in order to permit linking those libraries into non-free programs. When a program is linked with a library, whether statically or using a shared library, the combination of the two is legally speaking a combined work, a derivative of the original library. The ordinary General Public License therefore permits such linking only if the entire combination fits its criteria of freedom. The Lesser General Public License permits more lax criteria for linking other code with the library. We call this license the "Lesser" General Public License because it does Less to protect the user's freedom than the ordinary General Public License. It also provides other free software developers Less of an advantage over competing non-free programs. These disadvantages are the reason we use the ordinary General Public License for many libraries. However, the Lesser license provides advantages in certain special circumstances. For example, on rare occasions, there may be a special need to encourage the widest possible use of a certain library, so that it becomes a de-facto standard. To achieve this, non-free programs must be allowed to use the library. A more frequent case is that a free library does the same job as widely used non-free libraries. In this case, there is little to gain by limiting the free library to free software only, so we use the Lesser General Public License. In other cases, permission to use a particular library in non-free programs enables a greater number of people to use a large body of free software. For example, permission to use the GNU C Library in non-free programs enables many more people to use the whole GNU operating system, as well as its variant, the GNU/Linux operating system. Although the Lesser General Public License is Less protective of the users' freedom, it does ensure that the user of a program that is linked with the Library has the freedom and the wherewithal to run that program using a modified version of the Library. The precise terms and conditions for copying, distribution and modification follow. Pay close attention to the difference between a "work based on the library" and a "work that uses the library". The former contains code derived from the library, whereas the latter must be combined with the library in order to run. GNU LESSER GENERAL PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. This License Agreement applies to any software library or other program which contains a notice placed by the copyright holder or other authorized party saying it may be distributed under the terms of this Lesser General Public License (also called "this License"). Each licensee is addressed as "you". A "library" means a collection of software functions and/or data prepared so as to be conveniently linked with application programs (which use some of those functions and data) to form executables. 
The "Library", below, refers to any such software library or work which has been distributed under these terms. A "work based on the Library" means either the Library or any derivative work under copyright law: that is to say, a work containing the Library or a portion of it, either verbatim or with modifications and/or translated straightforwardly into another language. (Hereinafter, translation is included without limitation in the term "modification".) "Source code" for a work means the preferred form of the work for making modifications to it. For a library, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the library. Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running a program using the Library is not restricted, and output from such a program is covered only if its contents constitute a work based on the Library (independent of the use of the Library in a tool for writing it). Whether that is true depends on what the Library does and what the program that uses the Library does. 1. You may copy and distribute verbatim copies of the Library's complete source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice and disclaimer of warranty; keep intact all the notices that refer to this License and to the absence of any warranty; and distribute a copy of this License along with the Library. You may charge a fee for the physical act of transferring a copy, and you may at your option offer warranty protection in exchange for a fee. 2. You may modify your copy or copies of the Library or any portion of it, thus forming a work based on the Library, and copy and distribute such modifications or work under the terms of Section 1 above, provided that you also meet all of these conditions: a) The modified work must itself be a software library. b) You must cause the files modified to carry prominent notices stating that you changed the files and the date of any change. c) You must cause the whole of the work to be licensed at no charge to all third parties under the terms of this License. d) If a facility in the modified Library refers to a function or a table of data to be supplied by an application program that uses the facility, other than as an argument passed when the facility is invoked, then you must make a good faith effort to ensure that, in the event an application does not supply such function or table, the facility still operates, and performs whatever part of its purpose remains meaningful. (For example, a function in a library to compute square roots has a purpose that is entirely well-defined independent of the application. Therefore, Subsection 2d requires that any application-supplied function or table used by this function must be optional: if the application does not supply it, the square root function must still compute square roots.) These requirements apply to the modified work as a whole. If identifiable sections of that work are not derived from the Library, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works. 
But when you distribute the same sections as part of a whole which is a work based on the Library, the distribution of the whole must be on the terms of this License, whose permissions for other licensees extend to the entire whole, and thus to each and every part regardless of who wrote it. Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by you; rather, the intent is to exercise the right to control the distribution of derivative or collective works based on the Library. In addition, mere aggregation of another work not based on the Library with the Library (or with a work based on the Library) on a volume of a storage or distribution medium does not bring the other work under the scope of this License. 3. You may opt to apply the terms of the ordinary GNU General Public License instead of this License to a given copy of the Library. To do this, you must alter all the notices that refer to this License, so that they refer to the ordinary GNU General Public License, version 2, instead of to this License. (If a newer version than version 2 of the ordinary GNU General Public License has appeared, then you can specify that version instead if you wish.) Do not make any other change in these notices. Once this change is made in a given copy, it is irreversible for that copy, so the ordinary GNU General Public License applies to all subsequent copies and derivative works made from that copy. This option is useful when you wish to copy part of the code of the Library into a program that is not a library. 4. You may copy and distribute the Library (or a portion or derivative of it, under Section 2) in object code or executable form under the terms of Sections 1 and 2 above provided that you accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange. If distribution of object code is made by offering access to copy from a designated place, then offering equivalent access to copy the source code from the same place satisfies the requirement to distribute the source code, even though third parties are not compelled to copy the source along with the object code. 5. A program that contains no derivative of any portion of the Library, but is designed to work with the Library by being compiled or linked with it, is called a "work that uses the Library". Such a work, in isolation, is not a derivative work of the Library, and therefore falls outside the scope of this License. However, linking a "work that uses the Library" with the Library creates an executable that is a derivative of the Library (because it contains portions of the Library), rather than a "work that uses the library". The executable is therefore covered by this License. Section 6 states terms for distribution of such executables. When a "work that uses the Library" uses material from a header file that is part of the Library, the object code for the work may be a derivative work of the Library even though the source code is not. Whether this is true is especially significant if the work can be linked without the Library, or if the work is itself a library. The threshold for this to be true is not precisely defined by law. 
If such an object file uses only numerical parameters, data structure layouts and accessors, and small macros and small inline functions (ten lines or less in length), then the use of the object file is unrestricted, regardless of whether it is legally a derivative work. (Executables containing this object code plus portions of the Library will still fall under Section 6.) Otherwise, if the work is a derivative of the Library, you may distribute the object code for the work under the terms of Section 6. Any executables containing that work also fall under Section 6, whether or not they are linked directly with the Library itself. 6. As an exception to the Sections above, you may also combine or link a "work that uses the Library" with the Library to produce a work containing portions of the Library, and distribute that work under terms of your choice, provided that the terms permit modification of the work for the customer's own use and reverse engineering for debugging such modifications. You must give prominent notice with each copy of the work that the Library is used in it and that the Library and its use are covered by this License. You must supply a copy of this License. If the work during execution displays copyright notices, you must include the copyright notice for the Library among them, as well as a reference directing the user to the copy of this License. Also, you must do one of these things: a) Accompany the work with the complete corresponding machine-readable source code for the Library including whatever changes were used in the work (which must be distributed under Sections 1 and 2 above); and, if the work is an executable linked with the Library, with the complete machine-readable "work that uses the Library", as object code and/or source code, so that the user can modify the Library and then relink to produce a modified executable containing the modified Library. (It is understood that the user who changes the contents of definitions files in the Library will not necessarily be able to recompile the application to use the modified definitions.) b) Use a suitable shared library mechanism for linking with the Library. A suitable mechanism is one that (1) uses at run time a copy of the library already present on the user's computer system, rather than copying library functions into the executable, and (2) will operate properly with a modified version of the library, if the user installs one, as long as the modified version is interface-compatible with the version that the work was made with. c) Accompany the work with a written offer, valid for at least three years, to give the same user the materials specified in Subsection 6a, above, for a charge no more than the cost of performing this distribution. d) If distribution of the work is made by offering access to copy from a designated place, offer equivalent access to copy the above specified materials from the same place. e) Verify that the user has already received a copy of these materials or that you have already sent this user a copy. For an executable, the required form of the "work that uses the Library" must include any data and utility programs needed for reproducing the executable from it. However, as a special exception, the materials to be distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable. 
It may happen that this requirement contradicts the license restrictions of other proprietary libraries that do not normally accompany the operating system. Such a contradiction means you cannot use both them and the Library together in an executable that you distribute. 7. You may place library facilities that are a work based on the Library side-by-side in a single library together with other library facilities not covered by this License, and distribute such a combined library, provided that the separate distribution of the work based on the Library and of the other library facilities is otherwise permitted, and provided that you do these two things: a) Accompany the combined library with a copy of the same work based on the Library, uncombined with any other library facilities. This must be distributed under the terms of the Sections above. b) Give prominent notice with the combined library of the fact that part of it is a work based on the Library, and explaining where to find the accompanying uncombined form of the same work. 8. You may not copy, modify, sublicense, link with, or distribute the Library except as expressly provided under this License. Any attempt otherwise to copy, modify, sublicense, link with, or distribute the Library is void, and will automatically terminate your rights under this License. However, parties who have received copies, or rights, from you under this License will not have their licenses terminated so long as such parties remain in full compliance. 9. You are not required to accept this License, since you have not signed it. However, nothing else grants you permission to modify or distribute the Library or its derivative works. These actions are prohibited by law if you do not accept this License. Therefore, by modifying or distributing the Library (or any work based on the Library), you indicate your acceptance of this License to do so, and all its terms and conditions for copying, distributing or modifying the Library or works based on it. 10. Each time you redistribute the Library (or any work based on the Library), the recipient automatically receives a license from the original licensor to copy, distribute, link with or modify the Library subject to these terms and conditions. You may not impose any further restrictions on the recipients' exercise of the rights granted herein. You are not responsible for enforcing compliance by third parties with this License. 11. If, as a consequence of a court judgment or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Library at all. For example, if a patent license would not permit royalty-free redistribution of the Library by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Library. If any portion of this section is held invalid or unenforceable under any particular circumstance, the balance of the section is intended to apply, and the section as a whole is intended to apply in other circumstances. 
It is not the purpose of this section to induce you to infringe any patents or other property right claims or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of the free software distribution system which is implemented by public license practices. Many people have made generous contributions to the wide range of software distributed through that system in reliance on consistent application of that system; it is up to the author/donor to decide if he or she is willing to distribute software through any other system and a licensee cannot impose that choice. This section is intended to make thoroughly clear what is believed to be a consequence of the rest of this License. 12. If the distribution and/or use of the Library is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the Library under this License may add an explicit geographical distribution limitation excluding those countries, so that distribution is permitted only in or among countries not thus excluded. In such case, this License incorporates the limitation as if written in the body of this License. 13. The Free Software Foundation may publish revised and/or new versions of the Lesser General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Library specifies a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. If the Library does not specify a license version number, you may choose any version ever published by the Free Software Foundation. 14. If you wish to incorporate parts of the Library into other free programs whose distribution conditions are incompatible with these, write to the author to ask for permission. For software which is copyrighted by the Free Software Foundation, write to the Free Software Foundation; we sometimes make exceptions for this. Our decision will be guided by the two goals of preserving the free status of all derivatives of our free software and of promoting the sharing and reuse of software generally. NO WARRANTY 15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE LIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 16. 
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. END OF TERMS AND CONDITIONS How to Apply These Terms to Your New Libraries If you develop a new library, and you want it to be of the greatest possible use to the public, we recommend making it free software that everyone can redistribute and change. You can do so by permitting redistribution under these terms (or, alternatively, under the terms of the ordinary General Public License). To apply these terms, attach the following notices to the library. It is safest to attach them to the start of each source file to most effectively convey the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found. Copyright (C) This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA Also add information on how to contact you by electronic and paper mail. You should also get your employer (if you work as a programmer) or your school, if any, to sign a "copyright disclaimer" for the library, if necessary. Here is a sample; alter the names: Yoyodyne, Inc., hereby disclaims all copyright interest in the library `Frob' (a library for tweaking knobs) written by James Random Hacker. , 1 April 1990 Ty Coon, President of Vice That's all there is to it! 
logilab_common-2.1.0/ChangeLog

ChangeLog for logilab.common
============================

2021-01-05 -- 1.8.1
    * feature: add tox helpers to make pipy and debian releases
    * fix: use TypedDict if python version > 3.8 only, otherwise use a Dict (TypedDict were imported from typing_extension 3.7.4, which is not available on debian buster)

2020-11-22 -- 1.8.0
    * deprecation: add subclass to DeprecationWarning with structured information (TargetRenamedDeprecationWarning, TargetDeprecatedDeprecationWarning, TargetRemovedDeprecationWarning, TargetMovedDeprecationWarning)
    * deprecation: add tests to ensure that DeprecationWarning target the correct line and the correct file
    * deprecation: add types annotations
    * declare that logilab.common ships type annotations (py.typed file)
    * various bug fixes

2020-09-03 -- 1.7.3
    * type: declare that logilab-common ship type annotations
    * make the build reproducible
    * fix(deprecation): stacked decorators breaks getting the real callable __name__ attribute
    * fix: in some situation (using several deprecation functions), renaming deprecation utils failed to point to the correct new name and used random internal names of the module

2020-06-24 -- 1.7.2
    * fix(deprecation): rollback to old class_deprecation being a class behavior

2020-06-11 -- 1.7.1
    * fix: import error on re.Pattern with python < 3.7

2020-06-10 -- 1.7.0
    * logilab-common requires python 3.6 now
    * greatly improve our CI and migrate it to heptapod/gitlab-ci
    * black the whole code base
    * move test suit to pytest
    * use check-manifest and fix related bugs in MANIFEST.in
    * integrates flake8 and please the flake8 gods
    * various fixes
    * class_deprecation is not a class anymore
    * pytest 5.4.2 breaks tests, pin to 5.4.1 for now

2020-05-25 -- 1.6.4
    * fix: rollback to old class_deprecation being a class behavior
    * fix: @functools.wraps broke callable_renamed, write a @lazy_wraps and use it everywhere in logilab.common.deprecation
    * add docstring to LazyObject

2020-05-11 -- 1.6.3
    * fix: metaclass conflict in class_deprecated

2020-05-11 -- 1.6.2
    * fix: explicitly requires python 3.6 in setup.py

2020-05-01 -- 1.6.1
    * bug fix, bad usage of callable_renamed

2020-04-30 -- 1.6.0
    * logilab-common requires python >= 3.6 now
    * use pyannotates to introduces types in all the modules
    * introduce a list of new functions in logilab.common.deprecation: callable_renamed, attribute_renamed, argument_renamed, argument_remove
    * renamed "renamed" to "callable_renamed", "deprecated" to "callable_deprecated", "moved" to "callable_moved" for coherence
    * refactor the whole logilab.common.deprecation to simplify its code
    * automatically detect from which modules a deprecated utils is called
    * correctly display the line where a deprecated utils is used
    * various small fixes, thx mypy

2019-12-04 -- 1.5.2
    * documentation is now available at https://logilab-common.readthedocs.io/
    * drop python2 support, python >= 3.3 is the new required version
    * therefor, drop dependency on six
    * drop rpm packaging
    * registry: add a Registry.add_select_best_listener method to subscribe to the result of a _select_best of a Registry.
    * shellutils: deprecate 'input' as argument of RawInput in favor of 'input_function'
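The 1.6.0 entry above introduces a new family of helpers in logilab.common.deprecation. A minimal sketch of typical usage follows; the config-related functions are invented for the example, and the exact signatures (a message argument for callable_deprecated, an (old name, new callable) pair for callable_renamed) are assumptions carried over from the pre-1.6 deprecated/renamed helpers, not a verbatim API reference::

    from logilab.common.deprecation import callable_deprecated, callable_renamed

    def parse_config(path):
        """new, preferred entry point (invented for this sketch)"""
        with open(path) as stream:
            return dict(line.split("=", 1) for line in stream if "=" in line)

    @callable_deprecated("use parse_config() instead")
    def read_config(path):
        """old entry point, kept so that callers get a DeprecationWarning"""
        return parse_config(path)

    # keep an even older name importable, warning and forwarding to the new callable
    load_config = callable_renamed("load_config", parse_config)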
2016-10-03 -- 1.3.0
    * pytest: executable deprecated and renamed as logilab-pytest to prevent conflict with pytest provided by http://pytest.org/

2016-03-15 -- 1.2.0
    * pytest: TraceController class, pause_tracing and resume_tracing functions, deprecated from 0.63.1, got removed. The nocoverage and pause_trace utilities are now available from the testlib module rather than pytest.
    * date: datetime2ticks uses the milliseconds from the datetime objects

2015-10-12 -- 1.1.0
    * configuration: have a stable order for sections (#298658)
    * testlib: clean out deprecated TestCase methods (#1716063), move pytest specifics to pytest.py (#1716053)
    * fix a few python3 bugs in umessage, configuration and optik_ext modules
    * testlib: report failures and skips in generative tests properly
    * optik_ext: return bytes as ints and not floats (#2086835)

2015-07-08 -- 1.0.2
    * declare setuptools requirement in __pkginfo__/setup.py
    * randomize order of test modules in pytest -t

2015-07-01 -- 1.0.1
    * restore __pkginfo__.version, which pylint < 1.4.4 uses

2015-06-30 -- 1.0.0
    * remove unused/deprecated modules: cli, contexts, corbautils, dbf, pyro_ext, xmlrpcutils. __pkginfo__ is no longer installed.
    * major layout change
    * use setuptools exclusively
    * 'logilab' is now a proper namespace package
    * modutils: basic support for namespace packages
    * registry: ambiguous selects now raise a specific exception
    * testlib: better support for non-pytest launchers
    * testlib: Tags() now work with py3k

2014-11-30 -- 0.63.2
    * fix 2 minor regressions from 0.63.1

2014-11-28 -- 0.63.1
    * fix fallout from py3k conversion
    * pytest: fix TestSuite.run wrapper (#280806)
    * daemon: change umask after creating pid file

2014-11-05 -- 0.63.0
    * drop compatibility with python <= 2.5 (#264017)
    * fix textutils.py doctests for py3k
    * produce a clearer exception when dot is not installed (#253516)
    * make source python3-compatible (3.3+), without using 2to3. This introduces a dependency on six (#265740)
    * fix umessage header decoding on python 3.3 and newer (#149345)
    * WARNING: the compat module no longer exports 'callable', 'izip', 'imap', 'chain', 'sum', 'enumerate', 'frozenset', 'reversed', 'sorted', 'max', 'relpath', 'InheritableSet', or any subprocess-related names.
2014-07-30 -- 0.62.1
    * shellutils: restore py 2.5 compat by removing usage of class decorator
    * pytest: drop broken --coverage option
    * testlib: support for skipping whole test class and conditional skip, don't run setUp for skipped tests
    * configuration: load options in config file order (#185648)

2014-03-07 -- 0.62.0
    * modutils: cleanup_sys_modules returns the list of cleaned modules

2014-02-11 -- 0.61.0
    * pdf_ext: removed, it had no known users (CVE-2014-1838)
    * shellutils: fix tempfile issue in Execute, and deprecate it (CVE-2014-1839)
    * pytest: use 'env' to run the python interpreter
    * graph: ensure output is ordered on node and graph ids (#202314)

2013-16-12 -- 0.60.1
    * modutils:
      - don't propagate IOError when package's __init__.py file doesn't exist (#174606)
      - ensure file is closed, may cause pb depending on the interpreter, eg pypy) (#180876)
      - fix support for `extend_path` based nested namespace packages ; Report and patch by John Johnson (#177651)
    * fix some cases of failing python3 install on windows platform / cross compilation (#180836)

2013-07-26 -- 0.60.0
    * configuration: rename option_name method into option_attrname (#140667)
    * deprecation: new DeprecationManager class (closes #108205)
    * modutils:
      - fix typo causing name error in python3 / bad message in python2 (#136037)
      - fix python3.3 crash in file_from_modpath due to implementation change of imp.find_module wrt builtin modules (#137244)
    * testlib: use assertCountEqual instead of assertSameElements/assertItemsEqual (deprecated), fixing crash with python 3.3 (#144526)
    * graph: use codecs.open avoid crash when writing utf-8 data under python3 (#155138)

2013-04-16 -- 0.59.1
    * graph: added pruning of the recursive search tree for detecting cycles in graphs (closes #2469)
    * testlib: check for generators in with_tempdir (closes #117533)
    * registry:
      - select_or_none should not silent ObjectNotFound exception (closes #119819)
      - remove 2 accidentally introduced tabs breaking python 3 compat (closes #117580)
    * fix umessages test w/ python 3 and LC_ALL=C (closes #119967, report and patch by Ian Delaney)

2013-01-21 -- 0.59.0
    * registry:
      - introduce RegistrableObject base class, mandatory to make classes automatically registrable, and cleanup code accordingly
      - introduce objid and objname methods on Registry instead of classid function and inlined code plus other refactorings to allow arbitrary objects to be registered, provided they inherit from new RegistrableInstance class (closes #98742)
      - deprecate usage of leading underscore to skip object registration, using __abstract__ explicitly is better and notion of registered object 'name' is now somewhat fuzzy
      - use register_all when no registration callback defined (closes #111011)
    * logging_ext: on windows, use colorama to display colored logs, if available (closes #107436)
    * packaging: remove references to ftp at logilab
    * deprecations: really check them
    * packaging: steal spec file from fedora (closes #113099)
    * packaging force python2.6 on rhel5 (closes #113099)
    * packaging Update download and project urls (closes #113099)
    * configuration: enhance merge_options function (closes #113458)
    * decorators: fix @monkeypatch decorator contract for dark corner cases such as monkeypatching of a callable instance: no more turned into an unbound method, which was broken in python 3 and probably not used anywhere (actually closes #104047).
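The 0.59.0 entry above touches the @monkeypatch decorator contract; for readers who have not met it, a minimal usage sketch (the class and method names are invented for the example)::

    from logilab.common.decorators import monkeypatch

    class Legacy:
        """some existing class we cannot edit directly"""
        def __init__(self, name):
            self.name = name

    @monkeypatch(Legacy)
    def as_dict(self):
        # attached to Legacy as a regular (bound) method
        return vars(self)

    print(Legacy("demo").as_dict())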
2012-11-14 -- 0.58.3
    * date: fix ustrftime() impl. for python3 (closes #82161, patch by Arfrever Frehtes Taifersar Arahesis) and encoding detection for python2 (closes #109740)
    * other python3 code and test fixes (closes #104047)
    * registry: Store.setdefault shouldn't raise RegistryNotFound (closes #111010)
    * table: stop encoding to iso-8859-1, use unicode (closes #105847)
    * setup: properly install additional files during build instead of install (closes #104045)

2012-07-30 -- 0.58.2
    * modutils: fixes (closes #100757 and #100935)

2012-07-17 -- 0.58.1
    * modutils, testlib: be more python implementation independant (closes #99493 and #99627)

2012-04-12 -- 0.58.0
    * new `registry` module containing a backport of CubicWeb selectable objects registry (closes #84654)
    * testlib: DocTestCase fix builtins pollution after doctest execution.
    * shellutil: add argument to ``ProgressBar.update`` to tune cursor progression (closes #88981)
    * deprecated: new DeprecationWrapper class (closes #88942)

2012-03-22 -- 0.57.2
    * texutils: apply_units raise ValueError if string isn'nt valid (closes #88808)
    * daemon: don't call putenv directly
    * pytest: do not enable extra warning other than DeprecationWarning.
    * testlib: DocTestCase fix builtins pollution after doctest execution.
    * testlib: replace sys.exit with raise ImportError (closes: #84159)
    * fix license in README
    * add trove classifiers (tell about python 3 support for pypi)

2011-10-28 -- 0.57.1
    * daemon: change $HOME after dropping privileges (closes #81297)
    * compat: method_type for py3k use instance of the class to have a real instance method (closes: #79268)

2011-10-12 -- 0.57.0
    * only install unittest2 when python version < 2.7 (closes: #76068)
    * daemon: make pidfile world-readable (closes #75968)
    * daemon: remove unused(?) DaemonMixin class
    * update compat module for callable() and method_type()
    * decorators: fix monkeypatch py3k compat (closes #75290)
    * decorators: provide a @cachedproperty decorator

2011-09-08 -- 0.56.2
    * daemon: call initgroups/setgid before setuid (closes #74173)
    * decorators: @monkeypatch should produce a method object (closes #73920)
    * modutils: allow overriding of _getobj by suppressing mangling

2011-08-05 -- 0.56.1
    * clcommands: #72450 --rc-file option doesn't work

2011-06-09 -- 0.56.0
    * clcommands: make registration possible by class decoration
    * date: new datetime/delta <-> seconds/days conversion function
    * decorators: refactored @cached to allow usages such as @cached(cacheattr='_cachename') while keeping bw compat

2011-04-01 -- 0.55.2
    * new function for password generation in shellutils
    * pyro_ext: allow to create a server without registering with a pyrons

2011-03-28 -- 0.55.1
    * fix date.ustrftime break if year <= 1900
    * fix graph.py incorrectly builds command lines using %s to call dot
    * new functions to get UTC datetime / time

2011-02-18 -- 0.55.0
    * new urllib2ext module providing a GSSAPI authentication handler, based on python-kerberos
    * graph: test and fix ordered_nodes() [closes #60288]
    * changelog: refactor ChangeLog class to ease overriding
    * testlib: Fix tag handling for generator.

2011-01-12 -- 0.54.0
    * dropped python 2.3 support
    * daemon: we can now specify umask to daemonize function, and it return different exit code according to the process
    * pyro_ext: new ns_reregister function to ensure a name is still properly registered in the pyro name server
    * hg: new incoming/outgoing functions backward compatible with regards to mercurial version (eg hg 1.6 and earlier)
    * testlib/pytest: more deprecation and removed code. Still on the way to unittest2
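The 0.56.0 and 0.57.0 entries above mention @cached(cacheattr=...) and @cachedproperty; a minimal sketch of how those decorators are used (the Repository class and its attributes are invented for the example)::

    from logilab.common.decorators import cached, cachedproperty

    class Repository:
        def __init__(self, path):
            self.path = path

        @cached
        def config(self):
            # computed once, then served from an instance-level cache
            return {"path": self.path, "vcs": "hg"}

        @cached(cacheattr='_summary_cache')  # form quoted in the 0.56.0 entry
        def summary(self):
            return "repository at %s" % self.path

        @cachedproperty
        def name(self):
            # property-style access, value computed on first use
            return self.path.rsplit("/", 1)[-1]

    repo = Repository("/srv/repos/common")
    print(repo.config(), repo.summary(), repo.name)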
2010-11-15 -- 0.53.0
    * first python3.x compatible release
    * __init__: tempattr context manager
    * shellutils: progress context manager

2010-10-11 -- 0.52.1
    * configuration: fix pb with option names as unicode string w/ python 2.5. Makes OptionError available through the module
    * textutils: text_to_dict skip comments (# lines)
    * compat: dropped some 2.2 compat
    * modutils: Consider arch-specific installation for STD_LIB_DIR definition

2010-09-28 -- 0.52.0
    * testlib is now based on unittest2, to prepare its own extinction. Warning are printed so you can easily migrate step by step.
    * restored python 2.3 compat in some modules, so one get a change to run pylint at least
    * textutils: use NFKD decomposition in unormalize()
    * logging_ext: don't try to use ansi colorized formatter when not in debug mode

2010-09-10 -- 0.51.1
    * logging_ext: init_log function splitted into smaller chunk to ease reuse in other contexts
    * clcommands: enhanced/cleaned api, nicer usage display
    * various pylint detected errors fixed

2010-08-26 -- 0.51.0
    * testlib: don't raise string exception (closes #35331)
    * hg: new module regrouping some mercurial utility functions
    * clcommands: refactored to get more object oriented api.
    * optparser: module is now deprecated, use clcommands instead
    * textutils: new split_url_or_path and text_to_dict functions
    * logging_ext:
      - init_log now accept optionaly any arbitrary handler
      - threshold default to DEBUG if debug flag is true and no threshold specified
    * date: new ustrftime implementation working around datetime limitaion on dates < 1900

2010-06-04 -- 0.50.3
    * logging: added new optional kw argument to init_log rotating_parameters
    * date: fix nb_open_days() codomain, positive natural numbers are expected
    * configuration:
      - skip option with no type, avoid pb with generated option such as long-help
      - handle level on man page generation

2010-05-21 -- 0.50.2
    * fix licensing information: LGPL v2.1 or greater
    * daemon: new daemonize function
    * modutils: fix some false negative of is_standard_module with 'from module import something' where something isn't a submodule
    * optik_ext: fix help generation for normal optparse using script if optik_ext has been imported (#24450)
    * textutils support 256 colors when available
    * testlib: add option splitlines to assertTextEquals

2010-04-26 -- 0.50.1
    * implements __repr__ on nullobject
    * configuration: avoid crash by skipping option without 'type' entry while input a config
    * pyro_ext: raise PyroError instead of exception

2010-04-20 -- 0.50.0
    * graph:
      - generate methods now takes an optional mapfile argument to generate html image maps
      - new ordered_nodes function taking a dependency graph dict as arguments and returning an ordered list of nodes
    * configuration:
      - nicer serialization of bytes / time option
      - may now contains several option provider with the same name
      - consider 'level' in option dict, --help displaying only option with level 0, and automatically adding --long-help options for higher levels
    * textutils: case insensitive apply_unit
    * sphinx_ext: new module usable as a sphinx pluggin and containing a new 'autodocstring' directive
    * ureports: output &#160; instead of &nbsp; for strict xhtml compliance
    * decorators: @cached propery copy inner function docstring

2010-03-16 -- 0.49.0
    * date: new 'totime' function
    * adbh, db, sqlgen modules moved to the new logilab-database package
    * pytest: when -x option is given, stop on the first error even if there are multiple test directories
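The 0.50.0 entry above describes graph.ordered_nodes as taking a dependency graph dict and returning an ordered list of nodes; a small sketch of that call (the graph below is invented, and the edge-direction convention is an assumption to check against the graph module's docstrings)::

    from logilab.common.graph import ordered_nodes

    # assumed convention: each key maps to the nodes it points to
    graph = {
        "build": ["compile", "doc"],
        "compile": ["fetch"],
        "doc": ["fetch"],
        "fetch": [],
    }
    print(ordered_nodes(graph))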
optional argument to [backup|restore]_commands * db: fix date processing for SQLServer 2005 * testlib: improve XML assertion by using ElementTree parser and a new 'context' lines argument 2010-02-17 -- 0.48.0 * date: fixed mx date time compat for date_range (#20651) * testlib: generative test should not be interrupted by self.skip() (#20648) 2010-02-10 -- 0.47.0 * adbh: changed backup / restore api (BREAKS COMPAT): - backup_command is now backup_commands (eg return a list of commands) - each command returned in backup_commands/restore_commands may now be list that may be used as argument to subprocess.call, or a string which will the requires a subshell - new sql_rename_col method * deprecation: deprecated now takes an optional 'stacklevel' argument, default to 2 * date: some functions to ease python's datetime module usage have been backported from cubicweb 2009-12-23 -- 0.46.0 * db / adbh: added SQL Server support using Pyodbc * db: - New optional extra_args argument to get_connection. - Support Windows Auth for SQLServer by giving extra_args='Trusted_Connection' to the sqlserver2005 driver 2009-11-23 -- 0.45.2 * configuration: - proper bytes and time option types support - make Method usable as 'callback' value - fix #8849 Using plugins, options and .pylintrc crashes PyLint * graph: fix has_path returned value to include the destination node, else we get an empty list which makes think there is no path (test added) 2009-08-26 -- 0.45.0 * added function for parsing XML processing instructions 2009-08-07 -- 0.44.0 * remove code deprecated for a while now * shellutils: replace confirm function by RawInput class /ASK singleton * deprecation: new deprecated decorator, replacing both obsolete and deprecated_function 2009-07-21 -- 0.43.0 * dbf: a DBF reader which reads Visual Fox Pro DBF format with Memo field (module from Yusdi Santoso) * shellutils: - #9764 add title to shellutils.ProgressBar - #9796 new confirm function * testlib: - simplify traceback manipulation (skip first frames corresponding to testlib functions) - -c now captures DeprecationWarnings * sphinxutils: simplified API * modutils: new cleanup_sys_modules function that removes modules under a list of directories from sys.modules 2009-07-17 -- 0.42.0 * pyro_ext: new module for pyro utilities * adbh: fix default set_null_allowed implementation, new case_sensitive resource descriptor 2009-06-03 -- 0.41.0 * modutils: new extrapath argument to modpath_from_file (see function's docstring for explanation) * adbh: new alter_column_support flag, sql_set_null_allowed and sql_change_col_type methods 2009-05-28 -- 0.40.1 * date: handle both mx.DateTime and datetime representations * db: use sqlite native module's Binary, not StringIO 2009-05-14 -- 0.40.0 * python < 2.3 are now officially unsupported * #9162: new module with some sphinx utilities * #9166: use a global variable to control mx datetime / py datetime usage * db: add time adapter for pysqlite2, fix mysql bool and string handling * configuration: don't print default for store_true / store_false option or option with None as default 2009-04-07 -- 0.39.1 * fix #6760 umessage.decode_QP() crashes on unknown encoding 2009-03-25 -- 0.39.0 * fix #7915 (shellutils unusable under windows) * testlib: * new profile option using cProfile * allows to skip a module by raising TestSkipped from module import * modutils: locate modules in zip/egg archive * db: USE_MX_DATETIME global to control usage of mx.DateTime / py datetime 2009-01-26 -- 0.38.0 * setuptools / easy_install support! 
* removed some old backward compat code * adbh: new intersect_all_support attribute * contexts: new pushd context manager * shellutils: enhance acquire_lock method w/ race condition * configuration: fix case sensitivity pb w/ config file sections * pytest: reimplemented colorization 2009-01-08 -- 0.37.2 * configuration: encoding handling for configuration file generation * adbh: fix Datetime type map for mysql * logging_ext: drop lldebug level which shouldn't be there 2008-12-11 -- 0.37.1 * contexts: make the module syntactically correct wrt python2.4 2008-12-09 -- 0.37.0 * contexts: new module for context managers, keeping py <2.4 syntax compat for distribution (only `tempdir` cm for now) * tasksqueue: new module containing a class to handle prioritized tasks queue * proc: new module for process information / resource control * optik_ext: new time/bytes option types, using textutils conversion function * logging_ext: new set_log_methods / init_log utility functions 2008-10-30 -- 0.36.0 * configuration: - option yn is now behaving like a flag (i.e --ex : if ex.default=True and --ex in sys.args then ex.value=False) - new attribute hide in option (i.e --ex : if --ex has 'hide':True then the option will not be displayed in man or --help) * pytest: - add colors in display - new option --restart that skips tests that succeeded on last run * cache: new herits from dict class * decorators: add @require_version @require_module that skip test if decorators are not satisfied 2008-10-09 -- 0.35.3 * graph: new has_path method 2008-10-01 -- 0.35.2 * configuration: - fix #6011: lgc.configuration ignore customized option values - fix #3278: man page generation broken * dropped context.py module which broke the debian package when some python <2.5 is installed (#5979) 2008-09-10 -- 0.35.0 * fix #5945: wrong edge properties in graph.DotBackend * testlib: filter tests with tag decorator * shellutils: new simple unzip function 2008-08-07 -- 0.34.0 * changelog: properly adds new line at the end of each entry * testlib: add a with_tempdir decorator ensuring all temporary files and dirs are removed * graph: improve DotBackend configuration. graphiz rendered can now be selected and additional graph parameter used * db: support of Decimal Type 2008-06-25 -- 0.33.0 * decorators: new @locked decorator * cache: make it thread safe, changed behaviour so that when cache size is 0 and __delitem__ is called, a KeyError is raised (more consistent) * testlib: - added assertIsNot, assertNone and assertNotNone assertion - added assertUnorderedIterableEquals - added assertDirEquals - various failure output improvement * umessage: umessage.date() may return unparsable string as is instead of None * compat: adds a max function taking 'key' as keyword argument as in 2.5 * configuration: escape rest when printing for default value 2008-06-08 -- 0.32.0 * textutils: add the apply_unit function * testlib: - added a assertXMLEqualsTuple test assertion - added a assertIs assertion 2008-05-08 -- 0.31.0 * improved documentation and error messages * testlib: support a msg argument on more assertions, pysqlite2 as default * pytest: pytestconf.py for customization 2008-03-26 -- 0.30.0 * db: remember logged user on the connection * clcommands: commands may be hidden (e.g. 
not displayed in help), generic ListCommandsCommand useful to build bash completion helpers * changelog: module to parse ChangeLog file as this one, backported from logilab.devtools 2008-03-12 -- 0.29.1 * date: new nb_open_days function counting worked days between two date * adbh: add -p option to mysql commands to ask for password 2008-03-05 -- 0.29.0 * adbh: mysql doesn't support ILIKE, implement list_indices for mysql * db: mysql adapter use mx DateTime when available, fix unicode handling 2008-02-18 -- 0.28.2 * testlib: restore python2.3 compatibility 2008-02-15 -- 0.28.1 * testlib: introduce InnerTest class to name generative tests, fix generative tests description storage * pytest: fix -s option * modutils: included Stefan Rank's patch to deal with 2.4 relative import * configuration: don't give option's keywords not recognized by optparse, fix merge_options function 2008-02-05 -- 0.28.0 * date: new `add_days_worked` function * shellutils: new `chown` function * testlib: new `strict` argument to assertIsInstance * __init__: new `attrdict` and `nullobject` classes 2008-01-25 -- 0.27.0 * deprecation: new class_moved utility function * interface: fix subinterface handling 2008-01-10 -- 0.26.1 * optparser: support --version at main command level * testlib: added man page for pytest * textutils: fix a bug in normalize{_,_rest_}paragraph which may cause infinite loop if an indent string containing some spaces is given 2008-01-07 -- 0.26.0 * db: binarywrap support * modutils: new LazyObject class 2007-12-20 -- 0.25.2 * adbh: new needs_from_clause variable on db helper 2007-12-11 -- 0.25.1 * pytest: new --profile option, setup module / teardown module hook, other fixes and enhancements * db: mysql support fixes * adbh: fix postgres list_indices implementation 2007-11-26 -- 0.25.0 * adbh: - list_tables implementation for sqlite - new list_indices, create_index, drop_index methods * restore python < 2.4 compat 2007-10-29 -- 0.24.0 * decorators: new classproperty decorator * adbh: new module containing advanced db helper which were in the "db" module, with additional registered procedures handling 2007-10-23 -- 0.23.1 * modutils: fix load_module_from_* (even with use_sys=False, it should try to get outer packages from sys.modules) 2007-10-17 -- 0.23.0 * db: - mark support_users and support_groups methods as obsolete in favor of users_support and groups_support attributes - new ilike_support property on dbms helpers - extended db helper api - completed mysql support * textutils: new unormalize function to normalize diacritical chars by their ascii equivalent * modutils: new load_module_from_file shortcut function * clcommands: pop_args accept None as value for expected_size_after, meaning remaining args should not be checked * interface: new extend function to dynamically add an implemented interface to a new style class 2007-06-25 -- 0.22.2 * new 'typechanged' action for configuration.read_old_config 2007-05-14 -- 0.22.1 * important bug fix in db.py * added history in pytest debugger sessions * fix pytest coverage bug * fix textutils test * fix a bug which provoked a crash if devtools was not installed 2007-05-14 -- 0.22.0 * pytest improvements * shellutils: use shutil.move instead of os.rename as default action of mv * db: new `list_users` and `sql_drop_unique_constraint` methods on advanced helpers * deprecation: new `obsolete` decorator 2007-02-12 -- 0.21.3 * fixed cached decorator to use __dict__ instead of attribute lookup, avoiding potential bugs with inheritance when using cached class 
methods 2007-02-05 -- 0.21.2 * fix ReST normalization (#3471) 2006-12-19 -- 0.21.1 * tree: make Node iterable (iter on its children) * configuration: fix #3197 (OptionsManagerMixin __init__ isn't passing correctly its "version" argument) * textutils: new 'rest' argument to normalize_text to better deal with ReST formated text * some packaging fixes 2006-11-14 -- 0.21.0 * db: - new optional keepownership argument to backup|restore_database methods - only register mxDatetime converters on psycopg2 adapter if mx.DateTime is available * moved some stuff which was in common __init__ file into specific module. At this occasion new "decorators" and "deprecation" modules has been added * deprecated fileutils.[files_by_ext,include_files_by_ext,exclude_files_by_ext] functions in favor of new function shellutils.find * mark the following modules for deprecation, they will be removed in a near version: * astutils: moved to astng * bind (never been used) * html: deprecated * logger/logservice: use logging module * monclient/monserver (not used anymore) * patricia (never been used) * twisted_distutils (not used anymore) * removed the following functions/methods which have been deprecated for a while now: * modutils.load_module_from_parts * textutils.searchall * tree.Node.leafs * fileutils.get_by_ext, filetutils.get_mode, fileutils.ensure_mode * umessage: more robust charset handling 2006-11-03 -- 0.20.2 * fileutils: new remove_dead_links function * date: add missing strptime import 2006-11-01 -- 0.20.1 * umessage: - new message_from_string function - fixed get_payload encoding bug * db: default postgres module is now psycopg2, which has been customized to return mx.Datetime objects for date/time related types 2006-10-27 -- 0.20.0 * db: - fixed date handling - new methods on advanced helper to generate backup commands * configuration: basic deprecated config handling support * new implementation of pytest * backport a dot backend from yams into a new "graph" module 2006-10-03 -- 0.19.3 * fixed bug in textutils.normalise_[text|paragraph] with unsplitable word larger than the maximum line size * added pytest.bat for windows installation * changed configuration.generate_config to include None values into the generated file 2006-09-25 -- 0.19.2 * testlib: - fixed a bug in find_test making it returns some bad test names - new assertIsInstance method on TestCase * optik_ext: make it works if mx.DateTime is not installed, in which case the date type option won't be available * test fixes 2006-09-22 -- 0.19.1 * db: - fixed bug when querying boolean on sqlite using python's bool type - fixed time handling and added an adapter for DateTimeDeltaType - added "drop_on_commit" argument to create_temporary_table on db helper - added missing implementation of executemany on pysqlite2 wrapper to support pyargs correctly like execute * optik_ext: fixed "named" type option to support csv values and to return a dictionary 2006-09-05 -- 0.19.0 * new umessage module which provides a class similar to the standard email.Message class but returning unicode strings * new clcommands module to handle commands based command line tool (based on the configuration module) * new "date" option type in optik_ext * new AttrObject in testlib to create objects in test with arbitrary attributes * add pytest to run project's tests and get rid of all runtests.py * add pytest option to enable design-by-contract using aspects * some enhancements to the configuration module 2006-08-09 -- 0.18.0 * added -c / --capture option to 
testlib.unittest_main * fixed bugs in lgc.configuration * optparser: added a OptionParser that extends optparse's with commands 2006-07-13 -- 0.17.0 * python2.5 compatibility (testlib.py + compat.py) * testlib.assertListEquals return all errors at once * new "password" option type in optik_ext * configuration: refactored to support interactive input of a configuration 2006-06-08 -- 0.16.1 * testlib: improved test collections * compat: added cmp argument to sorted 2006-05-19 -- 0.16.0 * testlib: - added a set of command line options (PYDEBUG is deprecated, use the -i/--pdb option, and added -x/--exitfirst option) - added support for generative tests * db: - fix get_connection parameter order and host/port handling - added .sql_temporary_table method to advanced func helpers - started a psycopg2 adapter * configuration: enhanced to handle default value in help and man pages generation (require python >= 2.4) 2006-04-25 -- 0.15.1 * db: add missing port handling to get_connection function and dbapimodule.connect methods * testlib: various fixes and minor improvements 2006-03-28 -- 0.15.0 * added "cached" decorator and a simple text progression bar into __init__ * added a simple text progress bar into __init__ * configuration: fixed man page generation when using python 2.4 * db: added pysqllite2 support, preconfigured to handle timestamp using mxDatetime and to correctly handle boolean types 2006-03-06 -- 0.14.1 * backported file support and add LOG_CRIT to builtin in logservice module 2006-02-28 -- 0.14.0 * renamed assertXML*Valid to assertXML*WellFormed and deprecated the old name * fixed modutils.load_module_from_* 2006-02-03 -- 0.13.1 * fix some tests, patch contributed by Marien Zwart * added ability to log into a file with make_logger() 2006-01-06 -- 0.13.0 * testlib: ability to skip a test * configuration: - cleaner configuration file generation - refactoring so that we can have more control on file configuration loading using read_config_file and load_config_file instead of load_file_configuration * modutils: fix is_relative to return False when from_file is a file located somewhere in sys.path * ureport: new "escaped" attribute on Text nodes, controling html escaping * compat: make set iterable and support more other set operations... 
* removed the astng sub-package, since it's now self-distributed as logilab-astng 2005-09-06 -- 0.12.0 * shellutils: bug fix in mv() * compat: - use set when available - added sorted and reversed * table: new methods and some optimizations * tree: added some deprecation warnings 2005-07-25 -- 0.11.0 * db: refactoring, added sqlite support, new helpers to support DBMS specific features 2005-07-07 -- 0.10.1 * configuration: added basic man page generation feature * ureports: unicode handling, some minor fixes * testlib: enhance MockConnection * python2.2 related fixes in configuration and astng 2005-05-04 -- 0.10.0 * astng: improve unit tests coverage * astng.astng: fix Function.format_args, new method Function.default_value, bug fix in Node.resolve * astng.builder: handle classmethod and staticmethod as decorator, handle data descriptors when building from living objects * ureports: - new docbook formatter - handle ReST like urls in the text writer - new build_summary utility function 2005-04-14 -- 0.9.3 * optik_ext: add man page generation based on optik/optparse options definition * modutils: new arguments to get_source_file to handle files without extensions * astng: fix problem with the manager and python 2.2 (optik related) 2005-02-16 -- 0.9.2 * textutils: - added epydoc documentation - new sep argument to the get_csv function - fix pb with normalize_* functions on windows platforms * fileutils: - added epydoc documentation - fixed bug in get_by_ext (renamed files_by_ext) with the exclude_dirs argument * configuration: - fixed a bug in configuration file generation on windows platforms - better test coverage * fixed testlib.DocTest which wasn't working anymore with recent versions of pyunit * added "context_file" argument to file_from_modpath to avoid possible relative import problems * astng: use the new context_file argument from Node.resolve() 2005-02-04 -- 0.9.1 * astng: - remove buggy print - fixed builder to deal with builtin methods - fixed raw_building.build_function with python 2.4 * modutils: code cleanup, some reimplementation based on "imp", better handling of windows specific extensions, epydoc documentation * fileutils: new exclude_dirs argument to the get_by_ext function * testlib: main() support -p option to run test in a profiled mode * generated documentation for modutils in the doc/ subdirectory 2005-01-20 -- 0.9.0 * astng: - refactoring of some huge methods - fix interface resolving when __implements__ is defined in a parent class in another module - add special code in the builder to fix problem with qt - new source_line method on Node - fix sys.path during parsing to avoid some failure when trying to get imported names by `from module import *`, and use an astng building instead of exec'ing the statement - fix possible AttributeError with Function.type - manager.astng_from_file fallback to astng_from_module if possible * textutils: fix bug in normalize_paragraph, unquote handle empty string correctly * modutils: - use a cache in has_module to speed up things when heavily used - fix file_from_modpath to handle pyxml and os.path * configuration: fix problem with serialization/deserialization of empty string 2005-01-04 -- 0.8.0 * modutils: a lot of fixes/rewrite on various functions to avoid unnecessary imports, sys.path pollution, and other bugs (notably making pylint reporting wrong modules name/path) * astng: new "inspector" module, initially taken from pyreverse code (http://www.logilab.org/projects/pyreverse), miscellaneous bug fixes * configuration: new 'usage' 
parameter on the Configuration initializer * logger: unicode support * fileutils: get_by_ext also ignore ".svn" directories, not only "CVS" 2004-11-03 -- 0.7.1 * astng: - don't raise a syntax error on files missing a trailing \n. - fix utils.is_abstract (was causing an unexpected exception if a string exception was raised). - fix utils.get_implemented. - fix file based manager's cache problem. * textutils: fixed normalize_text / normalize_paragraph functions 2004-10-11 -- 0.7.0 * astng: new methods on the manager, returning astng with nodes for packages (i.e. recursive structure instead of the flat one), with automatic lazy loading + introduction of a dict like interface to manipulate those nodes and Module, Class and Function nodes. * logservice: module imported from the ginco project * configuration: added new classes Configuration and OptionsManager2Configuration adapter, fix bug in loading options from file * optik_ext/configuration: some new option type "multiple_choice" * fileutils: new ensure_mode function * compat: support for sum and enumerate 2004-09-23 -- 0.6.0 * db: added DBAPIAdapter * textutils: fix in pretty_match causing malformated messages in pylint added ansi colorization management * modutils: new functions get_module_files, has_module and file_from_modpath * astng: some new utility functions taken from pylint, minor changes to the manager API, Node.resolve doesn't support anymore "living" resolution, some new methods on astng nodes * compat: new module for a transparent compatibility layer between different python version (actually 2.2 vs 2.3 for now) 2004-07-08 -- 0.5.2 * astng: fix another bug in klassnode.ancestors() method... * db: fix mysql access * cli: added a space after the prompt 2004-06-04 -- 0.5.1 * astng: fix undefined var bug in klassnode.ancestors() method * ureports: fix attributes on title layout * packaging:fix the setup.py script to allow bdist_winst (well, the generated installer has not been tested...) with the necessary logilab/__init__.py file 2004-05-10 -- 0.5.0 * ureports: new Universal Reports sub-package * xmlrpcutils: new xmlrpc utilities module * astng: resolve(name) now handle (at least try) builtins * astng: fixed Class.as_string (empty parent when no base classes) * astng.builder: knows a little about method descriptors, Function with unknown arguments have argnames==None. * fileutils: new is_binary(filename) function * textutils: fixed some Windows bug * tree: base not doesn't have the "title" attribute anymore * testlib: removed the spawn function (who used that ?!), added MockSMTP, MockConfigParser, MockConnexion and DocTestCase (test class for modules embedding doctest). All mocks objects are very basic and will be enhanced as the need comes. * testlib: added a TestCase class with some additional methods then the regular unittest.TestCase class * cli: allow specifying a command prefix by a class attributes,more robust, print available commands on help * db: new "binary" function to get the binary wrapper for a given driver, and new "system_database" function returning the system database name for different DBMS. * configuration: better group control 2004-02-20 -- 0.4.5 * db: it's now possible to fix the modules search order. 
By default, call set_isolation_level if psycopg is used 2004-02-17 -- 0.4.4 * modutils: special case for os.path in get_module_part * astng: handle special case where we are on a package node importing a module using the same name as the package, which may end up in an infinite loop on relative imports in Node.resolve * fileutils: new get_by_ext function 2004-02-11 -- 0.4.3 * astng: refactoring of Class.ancestor_for_* methods (now depends on python 2.2 generators) * astng: make it more robust * configuration: more explicit exception when a bad option is provided * configuration: define a short version of an option using the "short" keyword, taking a single letter as value * configuration: new method global_set_option on the manager * testlib: allow no "suite" nor "Run" function in test modules * shellutils: fix bug in *mv* 2003-12-23 -- 0.4.2 * added Project class and some new methods to the ASTNGManager * some new functions in astng.utils * fixed bugs in some as_string methods * fixed bug in textutils.get_csv * fileutils.lines now takes a "comments" argument, allowing comment lines to be ignored 2003-11-24 -- 0.4.1 * added missing as_string methods on astng nodes * bug fixes on Node.resolve * minor fixes in textutils and fileutils * better test coverage (need more!) 2003-11-13 -- 0.4.0 * new textutils and shellutils modules * full astng rewrite, now based on the compiler.ast package from the standard library * added next_sibling and previous_sibling methods to Node * fix get_cycles 2003-10-14 -- 0.3.5 * fixed null size cache bug * added 'sort_by_column*' methods for tables 2003-10-08 -- 0.3.4 * fix bug in astng, occurring with python2.3 and modules including an encoding declaration * fix bug in astutils.get_rhs_consumed_names, occurring in list comprehensions * remove debug print statement from configuration.py which caused generation of incorrect configuration files. 
2003-10-01 -- 0.3.3 * fix bug in modutils.modpath_from_file * new module corbautils 2003-09-18 -- 0.3.2 * fix bug in modutils.load_module_from_parts * add missing __future__ imports 2003-09-18 -- 0.3.1 * change implementation of modutils.load_module_from_name (use find_module and load_module instead of __import__) * more bug fixes in astng * new functions in fileutils (lines, export) and __init__ (Execute) 2003-09-12 -- 0.3 * expect "def suite" or "def Run(runner=None)" on unittest module * fixes in modutils * major fixes in astng * new fileutils and astutils modules * enhancement of the configuration module * new option type "named" in optik_the ext module 2003-06-18 -- 0.2.2 * astng bug fixes 2003-06-04 -- 0.2.1 * bug fixes * fix packaging problem 2003-06-02 -- 0.2.0 * add the interface, modutils, optik_ext and configuration modules * add the astng sub-package * miscellaneous fixes 2003-04-17 -- 0.1.2 * add the stringio module * minor fixes 2003-02-28 -- 0.1.1 * fix bug in tree.py * new file distutils_twisted 2003-02-17 -- 0.1.0 * initial revision ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/MANIFEST.in0000666000000000000000000000116714762603732015012 0ustar00rootrootinclude ChangeLog include README* include COPYING include COPYING.LESSER include bin/logilab-pytest include bin/logilab-pytest.bat include test/data/ChangeLog include tox.ini include *.txt include logilab/common/py.typed recursive-include logilab *.py recursive-include test *.py *.txt *.msg *.ini *.zip *.egg recursive-include test/data/*_dir * recursive-include test/input *.py recursive-include doc/html * include docs/* include __pkginfo__.py prune debian exclude .hg-format-source prune docs/_build exclude .hgrc exclude .gitlab-ci.yml exclude .yamllint exclude .cube-doctor.yml exclude .readthedocs.yaml exclude CHANGELOG.md ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1741359094.9301832 logilab_common-2.1.0/PKG-INFO0000644000000000000000000000607714762603767014362 0ustar00rootrootMetadata-Version: 2.2 Name: logilab-common Version: 2.1.0 Summary: collection of low-level Python packages and modules used by Logilab projects Home-page: https://forge.extranet.logilab.fr/open-source/logilab-common Author: Logilab Author-email: contact@logilab.fr License: LGPL Classifier: Topic :: Utilities Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3 :: Only Requires-Python: >=3.6 License-File: COPYING License-File: COPYING.LESSER Requires-Dist: setuptools Requires-Dist: mypy-extensions Requires-Dist: typing_extensions Requires-Dist: importlib_metadata<7,>=6; python_version < "3.10" Dynamic: author Dynamic: author-email Dynamic: classifier Dynamic: description Dynamic: home-page Dynamic: license Dynamic: requires-dist Dynamic: requires-python Dynamic: summary Logilab's common library ======================== What's this ? ------------- This package contains some modules used by different Logilab projects. It is released under the GNU Lesser General Public License. There is no documentation available yet but the source code should be clean and well documented. Designed to ease: * handling command line options and configuration files * writing interactive command line tools * manipulation of files and character strings * manipulation of common structures such as graph, tree, and pattern such as visitor * generating text and HTML reports * more... 
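As a small illustrative sketch (not exhaustive), the two helpers below, `flatten` and `attrdict`, are taken from `logilab/common/__init__.py` as shipped in this distribution; any other names should be checked against the installed version ::

    from logilab.common import flatten, attrdict

    # flatten nested lists/tuples of arbitrary depth into a single list
    print(flatten([1, [2, [3, 4]], 5]))    # -> [1, 2, 3, 4, 5]

    # a dict whose keys are also readable as attributes
    options = attrdict(verbose=True, output="report.html")
    print(options.verbose)                 # -> True
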
Documentation ------------- Documentation is available at https://logilab-common.readthedocs.io/ Installation ------------ logilab-common is available on pypi so you can install it using pip :: pip install logilab-common Or alternatively extract the tarball, jump into the created directory and run :: python setup.py install For installation options, see :: python setup.py install --help Building the documentation -------------------------- Create a virtualenv and install dependencies :: virtualenv venv source venv/bin/activate # you need the krb5-config command to build all dependencies # on debian you can get it using "apt-get install libkrb5-dev" pip install doc/requirements-doc.txt # install logilab-common pip install -e . Then build the doc :: cd doc make html It's now available under `doc/_build/html/` Code style ---------- The python code is verified against *flake8* and formatted with *black*. * You can run `tox -e black` to check that the files are well formatted. * You can run `tox -e black-run` to format them if needed. * You can include the `.hgrc` to your own `.hgrc` to automatically run black before each commit/amend. This can be done by writing `%include ../.hgrc` at the end of your `.hgrc`. Comments, support, bug reports ------------------------------ Project page https://www.logilab.org/project/logilab-common Use the cubicweb-devel at lists.cubicweb.org mailing list. You can subscribe to this mailing list at https://lists.cubicweb.org/mailman/listinfo/cubicweb-devel Archives are available at https://lists.cubicweb.org/pipermail/cubicweb-devel/ ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/README.rst0000666000000000000000000000433414762603732014742 0ustar00rootrootLogilab's common library ======================== What's this ? ------------- This package contains some modules used by different Logilab projects. It is released under the GNU Lesser General Public License. There is no documentation available yet but the source code should be clean and well documented. Designed to ease: * handling command line options and configuration files * writing interactive command line tools * manipulation of files and character strings * manipulation of common structures such as graph, tree, and pattern such as visitor * generating text and HTML reports * more... Documentation ------------- Documentation is available at https://logilab-common.readthedocs.io/ Installation ------------ logilab-common is available on pypi so you can install it using pip :: pip install logilab-common Or alternatively extract the tarball, jump into the created directory and run :: python setup.py install For installation options, see :: python setup.py install --help Building the documentation -------------------------- Create a virtualenv and install dependencies :: virtualenv venv source venv/bin/activate # you need the krb5-config command to build all dependencies # on debian you can get it using "apt-get install libkrb5-dev" pip install doc/requirements-doc.txt # install logilab-common pip install -e . Then build the doc :: cd doc make html It's now available under `doc/_build/html/` Code style ---------- The python code is verified against *flake8* and formatted with *black*. * You can run `tox -e black` to check that the files are well formatted. * You can run `tox -e black-run` to format them if needed. * You can include the `.hgrc` to your own `.hgrc` to automatically run black before each commit/amend. 
This can be done by writing `%include ../.hgrc` at the end of your `.hgrc`. Comments, support, bug reports ------------------------------ Project page https://www.logilab.org/project/logilab-common Use the cubicweb-devel at lists.cubicweb.org mailing list. You can subscribe to this mailing list at https://lists.cubicweb.org/mailman/listinfo/cubicweb-devel Archives are available at https://lists.cubicweb.org/pipermail/cubicweb-devel/ ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/__pkginfo__.py0000666000000000000000000000356714762603732016065 0ustar00rootroot# copyright 2003-2014 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it # under the terms of the GNU Lesser General Public License as published by the # Free Software Foundation, either version 2.1 of the License, or (at your # option) any later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License # along with logilab-common. If not, see . """logilab.common packaging information""" __docformat__ = "restructuredtext en" import os from os.path import join distname = "logilab-common" modname = "common" subpackage_of = "logilab" subpackage_master = True numversion = (2, 1, 0) version = ".".join([str(num) for num in numversion]) license = "LGPL" # 2.1 or later description = "collection of low-level Python packages and modules" " used by Logilab projects" web = "https://forge.extranet.logilab.fr/open-source/logilab-common" author = "Logilab" author_email = "contact@logilab.fr" scripts = [join("bin", "logilab-pytest")] include_dirs = [join("test", "data")] install_requires = [ "setuptools", "mypy-extensions", "typing_extensions", 'importlib_metadata>=6,<7; python_version < "3.10"', ] tests_require = [ "pytz", "egenix-mx-base", ] if os.name == "nt": install_requires.append("colorama") classifiers = [ "Topic :: Utilities", "Programming Language :: Python", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3 :: Only", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/announce.txt0000666000000000000000000000044414762603732015620 0ustar00rootrootI'm pleased to announce the %(VERSION)s release of %(DISTNAME)s. What's new ? ------------ %(CHANGELOG)s What is %(DISTNAME)s ? 
------------------------ %(LONG_DESC)s Home page --------- %(WEB)s Download -------- %(FTP)s Mailing list ------------ %(MAILINGLIST)s %(ADDITIONAL_DESCR)s ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1741359094.8621821 logilab_common-2.1.0/bin/0000755000000000000000000000000014762603767014023 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/bin/logilab-pytest0000777000000000000000000000021214762603732016677 0ustar00rootroot#!/usr/bin/env python3 import warnings warnings.simplefilter('default', DeprecationWarning) from logilab.common.pytest import run run() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/bin/logilab-pytest.bat0000666000000000000000000000053714762603732017453 0ustar00rootroot@echo off rem = """-*-Python-*- script rem -------------------- DOS section -------------------- rem You could set PYTHONPATH or TK environment variables here python -x "%~f0" %* goto exit """ # -------------------- Python section -------------------- from logilab.common.pytest import run run() DosExitLabel = """ :exit rem """ ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.866182 logilab_common-2.1.0/docs/0000755000000000000000000000000014762603767014203 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/docs/Makefile0000666000000000000000000000110414762603732015633 0ustar00rootroot# Minimal makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build SOURCEDIR = . BUILDDIR = _build # Put it first so that "make" without argument is like "make help". help: @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) .PHONY: help Makefile # Catch-all target: route all unknown targets to Sphinx using the new # "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS). %: Makefile @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/docs/changelog.rst0000666000000000000000000000005614762603732016661 0ustar00rootroot.. _changelog_doc: .. include:: ../ChangeLog ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/docs/conf.py0000666000000000000000000001276514762603732015511 0ustar00rootroot# -*- coding: utf-8 -*- # # Configuration file for the Sphinx documentation builder. # # This file does only contain a selection of the most common options. For a # full list see the documentation: # http://www.sphinx-doc.org/en/master/config # -- Path setup -------------------------------------------------------------- # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. 
import os import sys sys.path.insert(0, os.path.abspath('..')) # -- Project information ----------------------------------------------------- project = u'logilab common' copyright = u'2019, Logilab' author = u'Logilab' # The short X.Y version version = u'' # The full version, including alpha/beta/rc tags release = u'' # -- General configuration --------------------------------------------------- # If your documentation needs a minimal Sphinx version, state it here. # # needs_sphinx = '1.0' # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. extensions = [ 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx', 'sphinx.ext.coverage', 'sphinx.ext.viewcode', ] autodoc_mock_imports = ["kerberos"] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] # The suffix(es) of source filenames. # You can specify multiple suffix as a list of string: # # source_suffix = ['.rst', '.md'] source_suffix = '.rst' # The master toctree document. master_doc = 'index' # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. # # This is also used if you do content translation via gettext catalogs. # Usually you set "language" from the command line for these cases. language = None # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. # This pattern also affects html_static_path and html_extra_path. exclude_patterns = [u'_build', 'Thumbs.db', '.DS_Store'] # The name of the Pygments (syntax highlighting) style to use. pygments_style = None # -- Options for HTML output ------------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. # html_theme = 'alabaster' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. # # html_theme_options = {} # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ['_static'] # Custom sidebar templates, must be a dictionary that maps document names # to template names. # # The default sidebars (for documents that don't match any pattern) are # defined by theme itself. Builtin themes are using these templates by # default: ``['localtoc.html', 'relations.html', 'sourcelink.html', # 'searchbox.html']``. # # html_sidebars = {} # -- Options for HTMLHelp output --------------------------------------------- # Output file base name for HTML help builder. htmlhelp_basename = 'logilabcommondoc' # -- Options for LaTeX output ------------------------------------------------ latex_elements = { # The paper size ('letterpaper' or 'a4paper'). # # 'papersize': 'letterpaper', # The font size ('10pt', '11pt' or '12pt'). # # 'pointsize': '10pt', # Additional stuff for the LaTeX preamble. # # 'preamble': '', # Latex figure (float) alignment # # 'figure_align': 'htbp', } # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, # author, documentclass [howto, manual, or own class]). 
latex_documents = [ (master_doc, 'logilabcommon.tex', u'logilab common Documentation', u'Logilab', 'manual'), ] # -- Options for manual page output ------------------------------------------ # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [ (master_doc, 'logilabcommon', u'logilab common Documentation', [author], 1) ] # -- Options for Texinfo output ---------------------------------------------- # Grouping the document tree into Texinfo files. List of tuples # (source start file, target name, title, author, # dir menu entry, description, category) texinfo_documents = [ (master_doc, 'logilabcommon', u'logilab common Documentation', author, 'logilabcommon', 'One line description of project.', 'Miscellaneous'), ] # -- Options for Epub output ------------------------------------------------- # Bibliographic Dublin Core info. epub_title = project # The unique identifier of the text. This can be a ISBN number # or the project homepage. # # epub_identifier = '' # A unique identification for the text. # # epub_uid = '' # A list of files that should not be packed into the epub file. epub_exclude_files = ['search.html'] # -- Extension configuration ------------------------------------------------- # -- Options for intersphinx extension --------------------------------------- # Example configuration for intersphinx: refer to the Python standard library. intersphinx_mapping = {'https://docs.python.org/': None} ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/docs/index.rst0000666000000000000000000001006014762603732016035 0ustar00rootroot.. logilab common documentation master file, created by sphinx-quickstart on Thu May 23 03:36:04 2019. You can adapt this file completely to your liking, but it should at least contain the root `toctree` directive. .. include:: ../README.rst Changelog --------- :ref:`changelog_doc` Provided modules ---------------- Here is a brief description of the available modules. Modules providing high-level features ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ * :ref:`cache `, a cache implementation with a least recently used algorithm. * :ref:`changelog `, a tiny library to manipulate our simplified ChangeLog file format. * :ref:`clcommands `, high-level classes to define command line programs handling different subcommands. It is based on `configuration` to get easy command line / configuration file handling. * :ref:`configuration `, some classes to handle unified configuration from both command line (using optparse) and configuration file (using ConfigParser). * :ref:`proc `, interface to Linux /proc. * :ref:`umessage `, unicode email support. * :ref:`ureports `, micro-reports, a way to create simple reports using python objects without care of the final formatting. ReST and html formatters are provided. Modules providing low-level functions and structures ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ * :ref:`compat `, provides a transparent compatibility layer between different python versions. * :ref:`date `, a set of date manipulation functions. * :ref:`daemon `, a daemon function and mix-in class to properly start an Unix daemon process. * :ref:`decorators `, function decorators such as cached, timed... * :ref:`deprecation `, decorator, metaclass & all to mark functions / classes as deprecated or moved * :ref:`fileutils `, some file / file path manipulation utilities. 
* :ref:`graph `, graph manipulations functions such as cycle detection, bases for dot file generation. * :ref:`modutils `, python module manipulation functions. * :ref:`shellutils `, some powerful shell like functions to replace shell scripts with python scripts. * :ref:`tasksqueue `, a prioritized tasks queue implementation. * :ref:`textutils `, some text manipulation functions (ansi colorization, line wrapping, rest support...). * :ref:`tree `, base class to represent tree structure, and some others to make it works with the visitor implementation (see below). * :ref:`visitor `, a generic visitor pattern implementation. Modules extending some standard modules ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ * :ref:`debugger `, `pdb` customization. * :ref:`logging_ext `, extensions to `logging` module such as a colorized formatter and an easier initialization function. * :ref:`optik_ext `, defines some new option types (regexp, csv, color, date, etc.) for `optik` / `optparse` Modules extending some external modules ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ * :ref:`sphinx_ext `, Sphinx_ plugin defining a `autodocstring` directive. * :ref:`vcgutils ` , utilities functions to generate file readable with Georg Sander's vcg tool (Visualization of Compiler Graphs). To be deprecated modules ~~~~~~~~~~~~~~~~~~~~~~~~ Those `logilab.common` modules will much probably be deprecated in future versions: * `testlib`: use `unittest2`_ instead * `interface`: use `zope.interface`_ if you really want this * `table`, `xmlutils`: is that used? * `sphinxutils`: we won't go that way imo (i == syt) .. _Sphinx: http://sphinx.pocoo.org/ .. _`unittest2`: http://pypi.python.org/pypi/unittest2 .. _`discover`: http://pypi.python.org/pypi/discover .. _`zope.interface`: http://pypi.python.org/pypi/zope.interface Reference ========= .. 
toctree:: :maxdepth: 2 :caption: Contents: logilab.common logilab.common.ureports Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/docs/logilab-pytest.10000666000000000000000000000253714762603732017227 0ustar00rootroot.TH logilab-pytest "1" "January 2008" logilab-pytest .SH NAME .B logilab-pytest \- run python unit tests .SH SYNOPSIS usage: logilab-pytest [OPTIONS] [testfile [testpattern]] .PP examples: .PP logilab-pytest path/to/mytests.py logilab-pytest path/to/mytests.py TheseTests logilab-pytest path/to/mytests.py TheseTests.test_thisone .PP logilab-pytest one (will run both test_thisone and test_thatone) logilab-pytest path/to/mytests.py \fB\-s\fR not (will skip test_notthisone) .PP logilab-pytest \fB\-\-coverage\fR test_foo.py .IP (only if logilab.devtools is available) .SS "options:" .TP \fB\-h\fR, \fB\-\-help\fR show this help message and exit .TP \fB\-t\fR TESTDIR directory where the tests will be found .TP \fB\-d\fR enable design\-by\-contract .TP \fB\-v\fR, \fB\-\-verbose\fR Verbose output .TP \fB\-i\fR, \fB\-\-pdb\fR Enable test failure inspection (conflicts with \fB\-\-coverage\fR) .TP \fB\-x\fR, \fB\-\-exitfirst\fR Exit on first failure (only make sense when logilab-pytest run one test file) .TP \fB\-s\fR SKIPPED, \fB\-\-skip\fR=\fISKIPPED\fR test names matching this name will be skipped to skip several patterns, use commas .TP \fB\-q\fR, \fB\-\-quiet\fR Minimal output .TP \fB\-P\fR PROFILE, \fB\-\-profile\fR=\fIPROFILE\fR Profile execution and store data in the given file .TP \fB\-\-coverage\fR run tests with pycoverage (conflicts with \fB\-\-pdb\fR) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/docs/logilab.common.rst0000666000000000000000000001370614762603732017640 0ustar00rootrootlogilab.common package ====================== Subpackages ----------- .. toctree:: logilab.common.ureports Submodules ---------- .. _cache: logilab.common.cache module --------------------------- .. automodule:: logilab.common.cache :members: :undoc-members: :show-inheritance: .. _changelog: logilab.common.changelog module ------------------------------- .. automodule:: logilab.common.changelog :members: :undoc-members: :show-inheritance: .. _clcommands: logilab.common.clcommands module -------------------------------- .. automodule:: logilab.common.clcommands :members: :undoc-members: :show-inheritance: .. _compat: logilab.common.compat module ---------------------------- .. automodule:: logilab.common.compat :members: :undoc-members: :show-inheritance: .. _configuration: logilab.common.configuration module ----------------------------------- .. automodule:: logilab.common.configuration :members: :undoc-members: :show-inheritance: .. _daemon: logilab.common.daemon module ---------------------------- .. automodule:: logilab.common.daemon :members: :undoc-members: :show-inheritance: .. _date: logilab.common.date module -------------------------- .. automodule:: logilab.common.date :members: :undoc-members: :show-inheritance: .. _debugger: logilab.common.debugger module ------------------------------ .. automodule:: logilab.common.debugger :members: :undoc-members: :show-inheritance: .. _decorators: logilab.common.decorators module -------------------------------- .. automodule:: logilab.common.decorators :members: :undoc-members: :show-inheritance: .. 
_deprecation: logilab.common.deprecation module --------------------------------- .. automodule:: logilab.common.deprecation :members: :undoc-members: :show-inheritance: .. _fileutils: logilab.common.fileutils module ------------------------------- .. automodule:: logilab.common.fileutils :members: :undoc-members: :show-inheritance: .. _graph: logilab.common.graph module --------------------------- .. automodule:: logilab.common.graph :members: :undoc-members: :show-inheritance: .. _interface: logilab.common.interface module ------------------------------- .. automodule:: logilab.common.interface :members: :undoc-members: :show-inheritance: .. _logging_ext: logilab.common.logging\_ext module ---------------------------------- .. automodule:: logilab.common.logging_ext :members: :undoc-members: :show-inheritance: .. _modutils: logilab.common.modutils module ------------------------------ .. automodule:: logilab.common.modutils :members: :undoc-members: :show-inheritance: .. _optik_ext: logilab.common.optik\_ext module -------------------------------- .. automodule:: logilab.common.optik_ext :members: :undoc-members: :show-inheritance: .. _optparser: logilab.common.optparser module ------------------------------- .. automodule:: logilab.common.optparser :members: :undoc-members: :show-inheritance: .. _proc: logilab.common.proc module -------------------------- .. automodule:: logilab.common.proc :members: :undoc-members: :show-inheritance: .. _pytest: logilab.common.pytest module ---------------------------- .. automodule:: logilab.common.pytest :members: :undoc-members: :show-inheritance: .. _registry: logilab.common.registry module ------------------------------ .. automodule:: logilab.common.registry :members: :undoc-members: :show-inheritance: .. _shellutils: logilab.common.shellutils module -------------------------------- .. automodule:: logilab.common.shellutils :members: :undoc-members: :show-inheritance: .. _sphinx_ext: logilab.common.sphinx\_ext module --------------------------------- .. automodule:: logilab.common.sphinx_ext :members: :undoc-members: :show-inheritance: .. _sphinxutils: logilab.common.sphinxutils module --------------------------------- .. automodule:: logilab.common.sphinxutils :members: :undoc-members: :show-inheritance: .. _table: logilab.common.table module --------------------------- .. automodule:: logilab.common.table :members: :undoc-members: :show-inheritance: .. _tasksqueue: logilab.common.tasksqueue module -------------------------------- .. automodule:: logilab.common.tasksqueue :members: :undoc-members: :show-inheritance: .. _testlib: logilab.common.testlib module ----------------------------- .. automodule:: logilab.common.testlib :members: :undoc-members: :show-inheritance: .. _textutils: logilab.common.textutils module ------------------------------- .. automodule:: logilab.common.textutils :members: :undoc-members: :show-inheritance: .. _tree: logilab.common.tree module -------------------------- .. automodule:: logilab.common.tree :members: :undoc-members: :show-inheritance: .. _umessage: logilab.common.umessage module ------------------------------ .. automodule:: logilab.common.umessage :members: :undoc-members: :show-inheritance: .. _urllib2ext: logilab.common.urllib2ext module -------------------------------- .. automodule:: logilab.common.urllib2ext :members: :undoc-members: :show-inheritance: .. _vcgutils: logilab.common.vcgutils module ------------------------------ .. 
automodule:: logilab.common.vcgutils :members: :undoc-members: :show-inheritance: .. _visitor: logilab.common.visitor module ----------------------------- .. automodule:: logilab.common.visitor :members: :undoc-members: :show-inheritance: .. _xmlutils: logilab.common.xmlutils module ------------------------------ .. automodule:: logilab.common.xmlutils :members: :undoc-members: :show-inheritance: Module contents --------------- .. automodule:: logilab.common :members: :undoc-members: :show-inheritance: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/docs/logilab.common.ureports.rst0000666000000000000000000000177314762603732021523 0ustar00rootroot.. _ureports: logilab.common.ureports package =============================== Submodules ---------- logilab.common.ureports.docbook\_writer module ---------------------------------------------- .. automodule:: logilab.common.ureports.docbook_writer :members: :undoc-members: :show-inheritance: logilab.common.ureports.html\_writer module ------------------------------------------- .. automodule:: logilab.common.ureports.html_writer :members: :undoc-members: :show-inheritance: logilab.common.ureports.nodes module ------------------------------------ .. automodule:: logilab.common.ureports.nodes :members: :undoc-members: :show-inheritance: logilab.common.ureports.text\_writer module ------------------------------------------- .. automodule:: logilab.common.ureports.text_writer :members: :undoc-members: :show-inheritance: Module contents --------------- .. automodule:: logilab.common.ureports :members: :undoc-members: :show-inheritance: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/docs/logilab.rst0000666000000000000000000000031614762603732016342 0ustar00rootrootlogilab package =============== Subpackages ----------- .. toctree:: logilab.common Module contents --------------- .. automodule:: logilab :members: :undoc-members: :show-inheritance: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/docs/make.bat0000666000000000000000000000136314762603732015607 0ustar00rootroot at ECHO OFF pushd %~dp0 REM Command file for Sphinx documentation if "%SPHINXBUILD%" == "" ( set SPHINXBUILD=sphinx-build ) set SOURCEDIR=. set BUILDDIR=_build if "%1" == "" goto help %SPHINXBUILD% >NUL 2>NUL if errorlevel 9009 ( echo. echo.The 'sphinx-build' command was not found. Make sure you have Sphinx echo.installed, then set the SPHINXBUILD environment variable to point echo.to the full path of the 'sphinx-build' executable. Alternatively you echo.may add the Sphinx directory to PATH. echo. echo.If you don't have Sphinx installed, grab it from echo.http://sphinx-doc.org/ exit /b 1 ) %SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% goto end :help %SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% :end popd ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/docs/modules.rst0000666000000000000000000000007214762603732016400 0ustar00rootrootlogilab ======= .. 
toctree:: :maxdepth: 4 logilab ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/docs/requirements-doc.txt0000666000000000000000000000007114762603732020224 0ustar00rootrootsphinx importlib_metadata>=6,<7; python_version < "3.10" ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1741359094.8501818 logilab_common-2.1.0/logilab/0000755000000000000000000000000014762603767014664 5ustar00rootroot././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1741359094.8861825 logilab_common-2.1.0/logilab/common/0000755000000000000000000000000014762603767016154 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/__init__.py0000666000000000000000000001276014762603732020267 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Logilab common library (aka Logilab's extension to the standard library). 
:type STD_BLACKLIST: tuple :var STD_BLACKLIST: directories ignored by default by the functions in this package which have to recurse into directories :type IGNORED_EXTENSIONS: tuple :var IGNORED_EXTENSIONS: file extensions that may usually be ignored """ __docformat__ = "restructuredtext en" import sys import types if sys.version_info < (3, 10): from importlib_metadata import version else: from importlib.metadata import version from typing import List, Sequence __version__ = version("logilab-common") # deprecated, but keep compatibility with pylint < 1.4.4 __pkginfo__ = types.ModuleType("__pkginfo__") __pkginfo__.__package__ = __name__ # mypy output: Module has no attribute "version" # logilab's magic __pkginfo__.version = __version__ # type: ignore sys.modules["logilab.common.__pkginfo__"] = __pkginfo__ STD_BLACKLIST = ("CVS", ".svn", ".hg", ".git", ".tox", "debian", "dist", "build") IGNORED_EXTENSIONS = (".pyc", ".pyo", ".elc", "~", ".swp", ".orig") # set this to False if you've mx DateTime installed but you don't want your db # adapter to use it (should be set before you got a connection) USE_MX_DATETIME = True class attrdict(dict): """A dictionary for which keys are also accessible as attributes.""" def __getattr__(self, attr: str) -> str: try: return self[attr] except KeyError: raise AttributeError(attr) class dictattr(dict): def __init__(self, proxy): self.__proxy = proxy def __getitem__(self, attr): try: return getattr(self.__proxy, attr) except AttributeError: raise KeyError(attr) class nullobject: def __repr__(self): return "" def __bool__(self): return False __nonzero__ = __bool__ class tempattr: def __init__(self, obj, attr, value): self.obj = obj self.attr = attr self.value = value def __enter__(self): self.oldvalue = getattr(self.obj, self.attr) setattr(self.obj, self.attr, self.value) return self.obj def __exit__(self, exctype, value, traceback): setattr(self.obj, self.attr, self.oldvalue) # flatten ----- # XXX move in a specific module and use yield instead # do not mix flatten and translate # # def iterable(obj): # try: iter(obj) # except: return False # return True # # def is_string_like(obj): # try: obj +'' # except (TypeError, ValueError): return False # return True # # def is_scalar(obj): # return is_string_like(obj) or not iterable(obj) # # def flatten(seq): # for item in seq: # if is_scalar(item): # yield item # else: # for subitem in flatten(item): # yield subitem def flatten(iterable, tr_func=None, results=None): """Flatten a list of list with any level. If tr_func is not None, it should be a one argument function that'll be called on each final element. :rtype: list >>> flatten([1, [2, 3]]) [1, 2, 3] """ if results is None: results = [] for val in iterable: if isinstance(val, (list, tuple)): flatten(val, tr_func, results) elif tr_func is None: results.append(val) else: results.append(tr_func(val)) return results # XXX is function below still used ? def make_domains(lists): """ Given a list of lists, return a list of domain for each list to produce all combinations of possibles values. 
:rtype: list Example: >>> make_domains(['a', 'b'], ['c','d', 'e']) [['a', 'b', 'a', 'b', 'a', 'b'], ['c', 'c', 'd', 'd', 'e', 'e']] """ domains = [] for iterable in lists: new_domain = iterable[:] for i in range(len(domains)): domains[i] = domains[i] * len(iterable) if domains: missing = (len(domains[0]) - len(iterable)) / len(iterable) i = 0 for j in range(len(iterable)): value = iterable[j] for dummy in range(missing): new_domain.insert(i, value) i += 1 i += 1 domains.append(new_domain) return domains # private stuff ################################################################ def _handle_blacklist(blacklist: Sequence[str], dirnames: List[str], filenames: List[str]) -> None: """remove files/directories in the black list dirnames/filenames are usually from os.walk """ for norecurs in blacklist: if norecurs in dirnames: dirnames.remove(norecurs) elif norecurs in filenames: filenames.remove(norecurs) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/cache.py0000666000000000000000000000731314762603732017571 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Cache module, with a least recently used algorithm for the management of the deletion of entries. """ __docformat__ = "restructuredtext en" from threading import Lock from logilab.common.decorators import locked from typing import TypeVar, List _marker = object() _KeyType = TypeVar("_KeyType") class Cache(dict): """A dictionary like cache. inv: len(self._usage) <= self.size len(self.data) <= self.size """ def __init__(self, size: int = 100) -> None: """Warning : Cache.__init__() != dict.__init__(). Constructor does not take any arguments beside size. 
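        A minimal usage sketch (added for illustration; the eviction behaviour is read off the
        code below and ``size=2`` is an arbitrary choice): once the cache holds ``size`` entries,
        the least recently used key is dropped to make room for a new one.

        >>> from logilab.common.cache import Cache
        >>> cache = Cache(size=2)
        >>> cache['a'] = 1
        >>> cache['b'] = 2
        >>> cache['c'] = 3   # 'a' is the least recently used key: it gets evicted
        >>> 'a' in cache
        False
        >>> sorted(cache)
        ['b', 'c']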
""" assert size >= 0, "cache size must be >= 0 (0 meaning no caching)" self.size = size self._usage: List = [] self._lock = Lock() super(Cache, self).__init__() def _acquire(self) -> None: self._lock.acquire() def _release(self) -> None: self._lock.release() def _update_usage(self, key: _KeyType) -> None: if not self._usage: self._usage.append(key) elif self._usage[-1] != key: try: self._usage.remove(key) except ValueError: # we are inserting a new key # check the size of the dictionary # and remove the oldest item in the cache if self.size and len(self._usage) >= self.size: super(Cache, self).__delitem__(self._usage[0]) del self._usage[0] self._usage.append(key) else: pass # key is already the most recently used key def __getitem__(self, key: _KeyType): value = super(Cache, self).__getitem__(key) self._update_usage(key) return value __getitem__ = locked(_acquire, _release)(__getitem__) def __setitem__(self, key: _KeyType, item): # Just make sure that size > 0 before inserting a new item in the cache if self.size > 0: super(Cache, self).__setitem__(key, item) self._update_usage(key) __setitem__ = locked(_acquire, _release)(__setitem__) def __delitem__(self, key: _KeyType): super(Cache, self).__delitem__(key) self._usage.remove(key) __delitem__ = locked(_acquire, _release)(__delitem__) def clear(self): super(Cache, self).clear() self._usage = [] clear = locked(_acquire, _release)(clear) def pop(self, key: _KeyType, default=_marker): if key in self: self._usage.remove(key) # if default is _marker: # return super(Cache, self).pop(key) return super(Cache, self).pop(key, default) pop = locked(_acquire, _release)(pop) def popitem(self): raise NotImplementedError() def setdefault(self, key, default=None): raise NotImplementedError() def update(self, other): raise NotImplementedError() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/changelog.py0000666000000000000000000002062314762603732020454 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) # any later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License # along with logilab-common. If not, see . """Manipulation of upstream change log files. The upstream change log files format handled is simpler than the one often used such as those generated by the default Emacs changelog mode. Sample ChangeLog format:: Change log for project Yoo ========================== -- * add a new functionality 2002-02-01 -- 0.1.1 * fix bug #435454 * fix bug #434356 2002-01-01 -- 0.1 * initial release There is 3 entries in this change log, one for each released version and one for the next version (i.e. the current entry). Each entry contains a set of messages corresponding to changes done in this release. All the non empty lines before the first entry are considered as the change log title. 
""" __docformat__ = "restructuredtext en" import sys from stat import S_IWRITE import codecs from typing import List, Any, Optional, Tuple from _io import StringIO BULLET = "*" SUBBULLET = "-" INDENT = " " * 4 class NoEntry(Exception): """raised when we are unable to find an entry""" class EntryNotFound(Exception): """raised when we are unable to find a given entry""" class Version(tuple): """simple class to handle soft version number has a tuple while correctly printing it as X.Y.Z """ def __new__(cls, versionstr): if isinstance(versionstr, str): versionstr = versionstr.strip(" :") # XXX (syt) duh? parsed = cls.parse(versionstr) else: parsed = versionstr return tuple.__new__(cls, parsed) @classmethod def parse(cls, versionstr: str) -> List[int]: versionstr = versionstr.strip(" :") try: return [int(i) for i in versionstr.split(".")] except ValueError as ex: raise ValueError(f"invalid literal for version '{versionstr}' ({ex})") def __str__(self) -> str: return ".".join([str(i) for i in self]) # upstream change log ######################################################### class ChangeLogEntry: """a change log entry, i.e. a set of messages associated to a version and its release date """ version_class = Version def __init__( self, date: Optional[str] = None, version: Optional[str] = None, **kwargs: Any ) -> None: self.__dict__.update(kwargs) self.version: Optional[Version] if version: self.version = self.version_class(version) else: self.version = None self.date = date self.messages: List[Tuple[List[str], List[List[str]]]] = [] def add_message(self, msg: str) -> None: """add a new message""" self.messages.append(([msg], [])) def complete_latest_message(self, msg_suite: str) -> None: """complete the latest added message""" if not self.messages: raise ValueError("unable to complete last message as " "there is no previous message)") if self.messages[-1][1]: # sub messages self.messages[-1][1][-1].append(msg_suite) else: # message self.messages[-1][0].append(msg_suite) def add_sub_message(self, sub_msg: str, key: Optional[Any] = None) -> None: if not self.messages: raise ValueError("unable to complete last message as " "there is no previous message)") if key is None: self.messages[-1][1].append([sub_msg]) else: raise NotImplementedError("sub message to specific key " "are not implemented yet") def write(self, stream: StringIO = sys.stdout) -> None: """write the entry to file""" stream.write(f"{self.date or ''} -- {self.version or ''}\n") for msg, sub_msgs in self.messages: stream.write(f"{INDENT}{BULLET} {msg[0]}\n") stream.write("".join(msg[1:])) if sub_msgs: stream.write("\n") for sub_msg in sub_msgs: stream.write(f"{INDENT * 2}{SUBBULLET} {sub_msg[0]}\n") stream.write("".join(sub_msg[1:])) stream.write("\n") stream.write("\n\n") class ChangeLog: """object representation of a whole ChangeLog file""" entry_class = ChangeLogEntry def __init__(self, changelog_file: str, title: str = "") -> None: self.file = changelog_file assert isinstance(title, type("")), "title must be a unicode object" self.title = title self.additional_content = "" self.entries: List[ChangeLogEntry] = [] self.load() def __repr__(self): return f"" def add_entry(self, entry: ChangeLogEntry) -> None: """add a new entry to the change log""" self.entries.append(entry) def get_entry(self, version="", create=None): """return a given changelog entry if version is omitted, return the current entry """ if not self.entries: if version or not create: raise NoEntry() self.entries.append(self.entry_class()) if not version: if 
self.entries[0].version and create is not None: self.entries.insert(0, self.entry_class()) return self.entries[0] version = self.version_class(version) for entry in self.entries: if entry.version == version: return entry raise EntryNotFound() def add(self, msg, create=None): """add a new message to the latest opened entry""" entry = self.get_entry(create=create) entry.add_message(msg) def load(self) -> None: """read a logilab's ChangeLog from file""" try: stream = codecs.open(self.file, encoding="utf-8") except IOError: return last: Optional[ChangeLogEntry] = None expect_sub = False for line in stream: sline = line.strip() words = sline.split() # if new entry if len(words) == 1 and words[0] == "--": expect_sub = False last = self.entry_class() self.add_entry(last) # if old entry elif len(words) == 3 and words[1] == "--": expect_sub = False last = self.entry_class(words[0], words[2]) self.add_entry(last) # if title elif sline and last is None: self.title = f"{self.title}{line}" # if new entry elif sline and sline[0] == BULLET: expect_sub = False assert last is not None last.add_message(sline[1:].strip()) # if new sub_entry elif expect_sub and sline and sline[0] == SUBBULLET: assert last is not None last.add_sub_message(sline[1:].strip()) # if new line for current entry elif sline and (last and last.messages): last.complete_latest_message(line) else: expect_sub = True self.additional_content += line stream.close() def format_title(self) -> str: return f"{self.title.strip()}\n\n" def save(self): """write back change log""" # filetutils isn't importable in appengine, so import locally from logilab.common.fileutils import ensure_fs_mode ensure_fs_mode(self.file, S_IWRITE) self.write(codecs.open(self.file, "w", encoding="utf-8")) def write(self, stream: StringIO = sys.stdout) -> None: """write changelog to stream""" stream.write(self.format_title()) for entry in self.entries: entry.write(stream) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/clcommands.py0000666000000000000000000002372414762603732020652 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Helper functions to support command line tools providing more than one command. e.g called as "tool command [options] args..." where and are command'specific """ __docformat__ = "restructuredtext en" import sys import logging from os.path import basename from logilab.common.configuration import Configuration from logilab.common.logging_ext import init_log, get_threshold class BadCommandUsage(Exception): """Raised when an unknown command is used or when a command is not correctly used (bad options, too much / missing arguments...). Trigger display of command usage. 
""" class CommandError(Exception): """Raised when a command can't be processed and we want to display it and exit, without traceback nor usage displayed. """ # command line access point #################################################### class CommandLine(dict): """Usage: >>> LDI = cli.CommandLine('ldi', doc='Logilab debian installer', version=version, rcfile=RCFILE) >>> LDI.register(MyCommandClass) >>> LDI.register(MyOtherCommandClass) >>> LDI.run(sys.argv[1:]) Arguments: * `pgm`, the program name, default to `basename(sys.argv[0])` * `doc`, a short description of the command line tool * `copyright`, additional doc string that will be appended to the generated doc * `version`, version number of string of the tool. If specified, global --version option will be available. * `rcfile`, path to a configuration file. If specified, global --C/--rc-file option will be available? self.rcfile = rcfile * `logger`, logger to propagate to commands, default to `logging.getLogger(self.pgm))` """ def __init__( self, pgm=None, doc=None, copyright=None, version=None, rcfile=None, logthreshold=logging.ERROR, check_duplicated_command=True, ): if pgm is None: pgm = basename(sys.argv[0]) self.pgm = pgm self.doc = doc self.copyright = copyright self.version = version self.rcfile = rcfile self.logger = None self.logthreshold = logthreshold self.check_duplicated_command = check_duplicated_command def register(self, cls, force=False): """register the given :class:`Command` subclass""" assert ( not self.check_duplicated_command or force or cls.name not in self ), f"a command {cls.name} is already defined" self[cls.name] = cls return cls def run(self, args): """main command line access point: * init logging * handle global options (-h/--help, --version, -C/--rc-file) * check command * run command Terminate by :exc:`SystemExit` """ init_log( debug=True, # so that we use StreamHandler logthreshold=self.logthreshold, logformat="%(levelname)s: %(message)s", ) try: arg = args.pop(0) except IndexError: self.usage_and_exit(1) if arg in ("-h", "--help"): self.usage_and_exit(0) if self.version is not None and arg in ("--version"): print(self.version) sys.exit(0) rcfile = self.rcfile if rcfile is not None and arg in ("-C", "--rc-file"): try: rcfile = args.pop(0) arg = args.pop(0) except IndexError: self.usage_and_exit(1) try: command = self.get_command(arg) except KeyError: print(f"ERROR: no {arg} command") print() self.usage_and_exit(1) try: sys.exit(command.main_run(args, rcfile)) except KeyboardInterrupt as exc: print("Interrupted", end=" ") if str(exc): print(f": {exc}", end=" ") print() sys.exit(4) except BadCommandUsage as err: print("ERROR:", err) print() print(command.help()) sys.exit(1) def create_logger(self, handler, logthreshold=None): logger = logging.Logger(self.pgm) logger.handlers = [handler] if logthreshold is None: logthreshold = get_threshold(self.logthreshold) logger.setLevel(logthreshold) return logger def get_command(self, cmd, logger=None): if logger is None: logger = self.logger if logger is None: logger = self.logger = logging.getLogger(self.pgm) logger.setLevel(get_threshold(self.logthreshold)) return self[cmd](logger) def usage(self): """display usage for the main program (i.e. when no command supplied) and exit """ print("usage:", self.pgm, end=" ") if self.rcfile: print("[--rc-file=]", end=" ") print(" [options] ...") if self.doc: print(f"\n{self.doc}") print( """ Type "%(pgm)s --help" for more information about a specific command. 
Available commands are :\n""" % self.__dict__ ) max_len = max([len(cmd) for cmd in self]) padding = " " * max_len for cmdname, cmd in sorted(self.items()): if not cmd.hidden: print(" ", (cmdname + padding)[:max_len], cmd.short_description()) if self.rcfile: print( """ Use --rc-file= / -C before the command to specify a configuration file. Default to %s. """ % self.rcfile ) print( f"""{self.__dict__['pgm']} -h/--help display this usage information and exit""" ) if self.version: print( f"""{self.__dict__['pgm']} -v/--version display version configuration and exit""" ) if self.copyright: print("\n", self.copyright) def usage_and_exit(self, status): self.usage() sys.exit(status) # base command classes ######################################################### class Command(Configuration): """Base class for command line commands. Class attributes: * `name`, the name of the command * `min_args`, minimum number of arguments, None if unspecified * `max_args`, maximum number of arguments, None if unspecified * `arguments`, string describing arguments, used in command usage * `hidden`, boolean flag telling if the command should be hidden, e.g. does not appear in help's commands list * `options`, options list, as allowed by :mod:configuration """ arguments = "" name = "" # hidden from help ? hidden = False # max/min args, None meaning unspecified min_args = None max_args = None @classmethod def description(cls): return cls.__doc__.replace(" ", "") @classmethod def short_description(cls): return cls.description().split(".")[0] def __init__(self, logger): usage = f"%prog {self.name} {self.arguments}\n\n{self.description()}" Configuration.__init__(self, usage=usage) self.logger = logger def check_args(self, args): """check command's arguments are provided""" if self.min_args is not None and len(args) < self.min_args: raise BadCommandUsage("missing argument") if self.max_args is not None and len(args) > self.max_args: raise BadCommandUsage("too many arguments") def main_run(self, args, rcfile=None): """Run the command and return status 0 if everything went fine. If :exc:`CommandError` is raised by the underlying command, simply log the error and return status 2. Any other exceptions, including :exc:`BadCommandUsage` will be propagated. """ if rcfile: self.load_file_configuration(rcfile) args = self.load_command_line_configuration(args) try: self.check_args(args) self.run(args) except CommandError as err: self.logger.error(err) return 2 return 0 def run(self, args): """run the command with its specific arguments""" raise NotImplementedError() class ListCommandsCommand(Command): """list available commands, useful for bash completion.""" name = "listcommands" arguments = "[command]" hidden = True def run(self, args): """run the command with its specific arguments""" if args: command = args.pop() cmd = _COMMANDS[command] for optname, optdict in cmd.options: print("--help") print("--" + optname) else: commands = sorted(_COMMANDS.keys()) for command in commands: cmd = _COMMANDS[command] if not cmd.hidden: print(command) _COMMANDS = CommandLine() DEFAULT_COPYRIGHT = """\ Copyright (c) 2004-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. http://www.logilab.fr/ -- mailto:contact@logilab.fr""" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/compat.py0000666000000000000000000000276514762603732020017 0ustar00rootroot# pylint: disable=E0601,W0622,W0611 # copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. 
# contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Wrappers around some builtins introduced in python 2.3, 2.4 and 2.5, making them available in for earlier versions of python. See another compatibility snippets from other projects: :mod:`lib2to3.fixes` :mod:`coverage.backward` :mod:`unittest2.compatibility` """ __docformat__ = "restructuredtext en" import types # not used here, but imported to preserve API import builtins # noqa def str_to_bytes(string): return str.encode(string) # See also http://bugs.python.org/issue11776 def method_type(callable, instance, klass): # api change. klass is no more considered return types.MethodType(callable, instance) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/configuration.py0000666000000000000000000013227614762603732021404 0ustar00rootroot# copyright 2003-2012 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Classes to handle advanced configuration in simple to complex applications. Allows to load the configuration from a file or from command line options, to generate a sample configuration file or to display program's usage. Fills the gap between optik/optparse and ConfigParser by adding data types (which are also available as a standalone optik extension in the `optik_ext` module). Quick start: simplest usage --------------------------- .. :: >>> import sys >>> from logilab.common.configuration import Configuration >>> options = [('dothis', {'type':'yn', 'default': True, 'metavar': ''}), ... ('value', {'type': 'string', 'metavar': ''}), ... ('multiple', {'type': 'csv', 'default': ('yop',), ... 'metavar': '', ... 'help': 'you can also document the option'}), ... ('number', {'type': 'int', 'default':2, 'metavar':''}), ... 
] >>> config = Configuration(options=options, name='My config') >>> print config['dothis'] True >>> print config['value'] None >>> print config['multiple'] ('yop',) >>> print config['number'] 2 >>> print config.help() Usage: [options] Options: -h, --help show this help message and exit --dothis= --value= --multiple= you can also document the option [current: none] --number= >>> f = open('myconfig.ini', 'w') >>> f.write('''[MY CONFIG] ... number = 3 ... dothis = no ... multiple = 1,2,3 ... ''') >>> f.close() >>> config.load_file_configuration('myconfig.ini') >>> print config['dothis'] False >>> print config['value'] None >>> print config['multiple'] ['1', '2', '3'] >>> print config['number'] 3 >>> sys.argv = ['mon prog', '--value', 'bacon', '--multiple', '4,5,6', ... 'nonoptionargument'] >>> print config.load_command_line_configuration() ['nonoptionargument'] >>> print config['value'] bacon >>> config.generate_config() # class for simple configurations which don't need the # manager / providers model and prefer delegation to inheritance # # configuration values are accessible through a dict like interface # [MY CONFIG] dothis=no value=bacon # you can also document the option multiple=4,5,6 number=3 Note : starting with Python 2.7 ConfigParser is able to take into account the order of occurrences of the options into a file (by using an OrderedDict). If you have two options changing some common state, like a 'disable-all-stuff' and a 'enable-some-stuff-a', their order of appearance will be significant : the last specified in the file wins. For earlier version of python and logilab.common newer than 0.61 the behaviour is unspecified. """ __docformat__ = "restructuredtext en" __all__ = ( "OptionsManagerMixIn", "OptionsProviderMixIn", "ConfigurationMixIn", "Configuration", "OptionsManager2ConfigurationAdapter", ) import os import sys import re from difflib import get_close_matches from os.path import exists, expanduser from optparse import OptionGroup from copy import copy from _io import StringIO, TextIOWrapper from typing import Any, Optional, Union, Dict, List, Tuple, Iterator, Callable import configparser as cp from logilab.common.types import OptionParser, Option, attrdict from logilab.common.textutils import normalize_text, unquote from logilab.common import optik_ext OptionError = optik_ext.OptionError REQUIRED: List = [] class UnsupportedAction(Exception): """raised by set_option when it doesn't know what to do for an action""" def _get_encoding(encoding: Optional[str], stream: Union[StringIO, TextIOWrapper]) -> str: encoding = encoding or getattr(stream, "encoding", None) if not encoding: import locale encoding = locale.getpreferredencoding() return encoding _ValueType = Union[List[str], Tuple[str, ...], str] # validation functions ######################################################## # validators will return the validated value or raise optparse.OptionValueError # XXX add to documentation def choice_validator(optdict: Dict[str, Any], name: str, value: str) -> str: """validate and return a converted value for option of type 'choice'""" if value not in optdict["choices"]: msg = "option %s: invalid value: %r, should be in %s" raise optik_ext.OptionValueError(msg % (name, value, optdict["choices"])) return value def multiple_choice_validator(optdict: Dict[str, Any], name: str, value: _ValueType) -> _ValueType: """validate and return a converted value for option of type 'choice'""" choices = optdict["choices"] values = optik_ext.check_csv(None, name, value) for value in values: if value 
not in choices: msg = "option %s: invalid value: %r, should be in %s" raise optik_ext.OptionValueError(msg % (name, value, choices)) return values def csv_validator(optdict: Dict[str, Any], name: str, value: _ValueType) -> _ValueType: """validate and return a converted value for option of type 'csv'""" return optik_ext.check_csv(None, name, value) def yn_validator(optdict: Dict[str, Any], name: str, value: Union[bool, str]) -> bool: """validate and return a converted value for option of type 'yn'""" return optik_ext.check_yn(None, name, value) def named_validator( optdict: Dict[str, Any], name: str, value: Union[Dict[str, str], str] ) -> Dict[str, str]: """validate and return a converted value for option of type 'named'""" return optik_ext.check_named(None, name, value) def file_validator(optdict, name, value): """validate and return a filepath for option of type 'file'""" return optik_ext.check_file(None, name, value) def color_validator(optdict, name, value): """validate and return a valid color for option of type 'color'""" return optik_ext.check_color(None, name, value) def password_validator(optdict, name, value): """validate and return a string for option of type 'password'""" return optik_ext.check_password(None, name, value) def date_validator(optdict, name, value): """validate and return a mx DateTime object for option of type 'date'""" return optik_ext.check_date(None, name, value) def time_validator(optdict, name, value): """validate and return a time object for option of type 'time'""" return optik_ext.check_time(None, name, value) def bytes_validator(optdict: Dict[str, str], name: str, value: Union[int, str]) -> int: """validate and return an integer for option of type 'bytes'""" return optik_ext.check_bytes(None, name, value) VALIDATORS: Dict[str, Callable] = { "string": unquote, "int": int, "float": float, "file": file_validator, "font": unquote, "color": color_validator, "regexp": re.compile, "csv": csv_validator, "yn": yn_validator, "bool": yn_validator, "named": named_validator, "password": password_validator, "date": date_validator, "time": time_validator, "bytes": bytes_validator, "choice": choice_validator, "multiple_choice": multiple_choice_validator, } def _call_validator( opttype: str, optdict: Dict[str, Any], option: str, value: Union[List[str], int, str] ) -> Union[List[str], int, str]: if opttype not in VALIDATORS: all_validators = "\n - ".join(sorted(VALIDATORS.keys())) raise Exception(f'Unsupported type "{opttype}", supported types are:\n - {all_validators}') try: return VALIDATORS[opttype](optdict, option, value) except TypeError: try: return VALIDATORS[opttype](value) except optik_ext.OptionValueError: raise except Exception: raise optik_ext.OptionValueError( f"{option} value ({value!r}) should be of type {opttype}" ) # user input functions ######################################################## # user input functions will ask the user for input on stdin then validate # the result and return the validated value or raise optparse.OptionValueError # XXX add to documentation def input_password(optdict, question="password:"): from getpass import getpass while True: value = getpass(question) value2 = getpass("confirm: ") if value == value2: return value print("password mismatch, try again") def input_string(optdict, question): value = input(question).strip() return value or None def _make_input_function(opttype): def input_validator(optdict, question): while True: value = input(question) if not value.strip(): return None try: return _call_validator(opttype, 
optdict, None, value) except optik_ext.OptionValueError as ex: msg = str(ex).split(":", 1)[-1].strip() print(f"bad value: {msg}") return input_validator INPUT_FUNCTIONS: Dict[str, Callable] = { "string": input_string, "password": input_password, } for opttype in VALIDATORS.keys(): INPUT_FUNCTIONS.setdefault(opttype, _make_input_function(opttype)) # utility functions ############################################################ def expand_default(self, option): """monkey patch OptionParser.expand_default since we have a particular way to handle defaults to avoid overriding values in the configuration file """ if self.parser is None or not self.default_tag: return option.help optname = option._long_opts[0][2:] try: provider = self.parser.options_manager._all_options[optname] except KeyError: value = None else: optdict = provider.get_option_def(optname) optname = provider.option_attrname(optname, optdict) value = getattr(provider.config, optname, optdict) value = format_option_value(optdict, value) if value is optik_ext.NO_DEFAULT or not value: value = self.NO_DEFAULT_VALUE return option.help.replace(self.default_tag, str(value)) def _validate( value: Union[List[str], int, str], optdict: Dict[str, Any], name: str = "" ) -> Union[List[str], int, str]: """return a validated value for an option according to its type optional argument name is only used for error message formatting """ try: _type = optdict["type"] except KeyError: # FIXME return value return _call_validator(_type, optdict, name, value) # format and output functions ################################################## def comment(string): """return string as a comment""" lines = [line.strip() for line in string.splitlines()] return "# " + f"{os.linesep}# ".join(lines) def format_time(value): if not value: return "0" if value != int(value): return f"{value:.2f}s" value = int(value) nbmin, nbsec = divmod(value, 60) if nbsec: return f"{value}s" nbhour, nbmin_ = divmod(nbmin, 60) if nbmin_: return f"{nbmin}min" nbday, nbhour_ = divmod(nbhour, 24) if nbhour_: return f"{nbhour}h" return f"{nbday}d" def format_bytes(value: int) -> str: if not value: return "0" if value != int(value): return f"{value:.2f}B" value = int(value) prevunit = "B" for unit in ("KB", "MB", "GB", "TB"): next, remain = divmod(value, 1024) if remain: return f"{value}{prevunit}" prevunit = unit value = next return f"{value}{unit}" def format_option_value(optdict: Dict[str, Any], value: Any) -> Union[None, int, str]: """return the user input's value from a 'compiled' value""" if isinstance(value, (list, tuple)): value = ",".join(value) elif isinstance(value, dict): value = ",".join([f"{k}:{v}" for k, v in value.items()]) elif hasattr(value, "match"): # optdict.get('type') == 'regexp' # compiled regexp value = value.pattern elif optdict.get("type") == "yn": value = value and "yes" or "no" elif isinstance(value, str) and value.isspace(): value = f"'{value}'" elif optdict.get("type") == "time" and isinstance(value, (float, int)): value = format_time(value) elif optdict.get("type") == "bytes" and hasattr(value, "__int__"): value = format_bytes(value) return value def ini_format_section( stream: Union[StringIO, TextIOWrapper], section: str, options: Any, encoding: str = None, doc: Optional[Any] = None, ) -> None: """format an options section using the INI format""" encoding = _get_encoding(encoding, stream) if doc: print(str(comment(doc), encoding), file=stream) print(f"[{section}]", file=stream) ini_format(stream, options, encoding) def ini_format(stream: Union[StringIO, 
TextIOWrapper], options: Any, encoding: str) -> None: """format options using the INI format""" for optname, optdict, value in options: value = format_option_value(optdict, value) help = optdict.get("help") if help: help = normalize_text(help, line_len=79, indent="# ") print(file=stream) print(str(help), file=stream) else: print(file=stream) if value is None: print(f"#{optname}=", file=stream) else: value = str(value).strip() if optdict.get("type") == "string" and "\n" in value: prefix = "\n " value = prefix + prefix.join(value.split("\n")) print(f"{optname}={value}", file=stream) format_section = ini_format_section def rest_format_section(stream, section, options, encoding=None, doc=None): """format an options section using as ReST formatted output""" encoding = _get_encoding(encoding, stream) if section: print("%s\n%s" % (section, "'" * len(section)), file=stream) if doc: print(str(normalize_text(doc, line_len=79, indent="")), file=stream) print(file=stream) for optname, optdict, value in options: help = optdict.get("help") print(f":{optname}:", file=stream) if help: help = normalize_text(help, line_len=79, indent=" ") print(str(help), file=stream) if value: value = str(format_option_value(optdict, value)) print(file=stream) print(f" Default: ``{value.replace('`` ', '```` ``')}``", file=stream) # Options Manager ############################################################## class OptionsManagerMixIn: """MixIn to handle a configuration from both a configuration file and command line options """ def __init__( self, usage: Optional[str], config_file: Optional[Any] = None, version: Optional[Any] = None, quiet: int = 0, ) -> None: self.config_file = config_file self.reset_parsers(usage, version=version) # list of registered options providers self.options_providers: List[ConfigurationMixIn] = [] # dictionary associating option name to checker self._all_options: Dict[str, ConfigurationMixIn] = {} self._short_options: Dict[str, str] = {} self._nocallback_options: Dict[ConfigurationMixIn, str] = {} self._mygroups: Dict[str, optik_ext.OptionGroup] = {} # verbosity self.quiet = quiet self._maxlevel = 0 def reset_parsers(self, usage: Optional[str] = "", version: Optional[Any] = None) -> None: # configuration file parser self.cfgfile_parser = cp.ConfigParser() # command line parser self.cmdline_parser = optik_ext.OptionParser(usage=usage, version=version) # mypy: "OptionParser" has no attribute "options_manager" # dynamic attribute? self.cmdline_parser.options_manager = self # type: ignore self._optik_option_attrs = set(self.cmdline_parser.option_class.ATTRS) def register_options_provider( self, provider: "ConfigurationMixIn", own_group: bool = True ) -> None: """register an options provider""" assert provider.priority <= 0, "provider's priority can't be >= 0" for i in range(len(self.options_providers)): if provider.priority > self.options_providers[i].priority: self.options_providers.insert(i, provider) break else: self.options_providers.append(provider) # mypy: Need type annotation for 'option' # you can't type variable of a list comprehension, right? 
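# options without an explicit "group" key are attached to the provider's own section
# (or added directly to the command line parser when own_group is False)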
non_group_spec_options: List = [ option for option in provider.options if "group" not in option[1] # type: ignore ] # type: ignore groups = getattr(provider, "option_groups", ()) if own_group and non_group_spec_options: self.add_option_group( provider.name.upper(), provider.__doc__, non_group_spec_options, provider ) else: for opt, optdict in non_group_spec_options: self.add_optik_option(provider, self.cmdline_parser, opt, optdict) for gname, gdoc in groups: gname = gname.upper() # mypy: Need type annotation for 'option' # you can't type variable of a list comprehension, right? goptions: List = [ option for option in provider.options # type: ignore if option[1].get("group", "").upper() == gname ] # type: ignore self.add_option_group(gname, gdoc, goptions, provider) def add_option_group( self, group_name: str, doc: Optional[str], options: Union[List[Tuple[str, Dict[str, Any]]], List[Tuple[str, Dict[str, str]]]], provider: "ConfigurationMixIn", ) -> None: """add an option group including the listed options""" assert options # add option group to the command line parser if group_name in self._mygroups: group = self._mygroups[group_name] else: group = optik_ext.OptionGroup(self.cmdline_parser, title=group_name.capitalize()) self.cmdline_parser.add_option_group(group) # mypy: "OptionGroup" has no attribute "level" # dynamic attribute group.level = provider.level # type: ignore self._mygroups[group_name] = group # add section to the config file if group_name != "DEFAULT": self.cfgfile_parser.add_section(group_name) # add provider's specific options for opt, optdict in options: self.add_optik_option(provider, group, opt, optdict) def add_optik_option( self, provider: "ConfigurationMixIn", optikcontainer: Union[OptionParser, OptionGroup], opt: str, optdict: Dict[str, Any], ) -> None: if "inputlevel" in optdict: raise ValueError( "optdict dictionary argument shouldn't contain the key 'inputlevel', rename it to " f"'level' instead: {optdict}" ) args, optdict = self.optik_option(provider, opt, optdict) option = optikcontainer.add_option(*args, **optdict) self._all_options[opt] = provider self._maxlevel = max(self._maxlevel, option.level or 0) def optik_option( self, provider: "ConfigurationMixIn", opt: str, optdict: Dict[str, Any] ) -> Tuple[List[str], Dict[str, Any]]: """get our personal option definition and return a suitable form for use with optik/optparse """ optdict = copy(optdict) if "action" in optdict: self._nocallback_options[provider] = opt else: optdict["action"] = "callback" optdict["callback"] = self.cb_set_provider_option # default is handled here and *must not* be given to optik if you # want the whole machinery to work if "default" in optdict: if ( "help" in optdict and optdict.get("default") is not None and not optdict["action"] in ("store_true", "store_false") ): optdict["help"] += " [current: %default]" del optdict["default"] args = ["--" + str(opt)] if "short" in optdict: self._short_options[optdict["short"]] = opt args.append("-" + optdict["short"]) del optdict["short"] # cleanup option definition dict before giving it to optik for key in list(optdict.keys()): if key not in self._optik_option_attrs: optdict.pop(key) return args, optdict def cb_set_provider_option( self, option: "Option", opt: str, value: Union[List[str], int, str], parser: "OptionParser" ) -> None: """optik callback for option setting""" if opt.startswith("--"): # remove -- on long option opt = opt[2:] else: # short option, get its long equivalent opt = self._short_options[opt[1:]] # trick since we can't set 
action='store_true' on options if value is None: value = 1 self.global_set_option(opt, value) def global_set_option(self, opt: str, value: Union[List[str], int, str]) -> None: """set option on the correct option provider""" self._all_options[opt].set_option(opt, value) def generate_config( self, stream: Union[StringIO, TextIOWrapper] = None, skipsections: Tuple[()] = (), encoding: Optional[Any] = None, header_message: Optional[str] = None, ) -> None: """write a configuration file according to the current configuration into the given stream or stdout """ options_by_section: Dict[Any, List] = {} sections = [] for provider in self.options_providers: for section, options in provider.options_by_section(): if section is None: section = provider.name if section in skipsections: continue options = [(n, d, v) for (n, d, v) in options if d.get("type") is not None] if not options: continue if section not in sections: sections.append(section) alloptions = options_by_section.setdefault(section, []) alloptions += options stream = stream or sys.stdout encoding = _get_encoding(encoding, stream) printed = False if header_message is not None: # Make sure all the lines in header_message begin with '# ' # This is done by matching all the lines that do not start # with '#' (possibly with whitespaces) and prefix them by '# ' exp = re.compile(r"^\s*(?=[^#])", re.MULTILINE) commented_header = exp.sub("# ", header_message) print(commented_header, file=stream) for section in sections: if printed: print("\n", file=stream) format_section(stream, section.upper(), options_by_section[section], encoding) printed = True def generate_manpage( self, pkginfo: attrdict, section: int = 1, stream: StringIO = None ) -> None: """write a man page for the current configuration into the given stream or stdout """ self._monkeypatch_expand_default() try: optik_ext.generate_manpage( self.cmdline_parser, pkginfo, section, stream=stream or sys.stdout, level=self._maxlevel, ) finally: self._unmonkeypatch_expand_default() # initialization methods ################################################## def load_provider_defaults(self) -> None: """initialize configuration using default values""" for provider in self.options_providers: provider.load_defaults() def load_file_configuration(self, config_file: str = None) -> None: """load the configuration from file""" self.read_config_file(config_file) self.load_config_file() def read_config_file(self, config_file: str = None) -> None: """read the configuration file but do not load it (i.e. dispatching values to each options provider) """ helplevel = 1 while helplevel <= self._maxlevel: opt = "-".join(["long"] * helplevel) + "-help" if opt in self._all_options: break # already processed def helpfunc(option, opt, val, p, level=helplevel): print(self.help(level)) sys.exit(0) helpmsg = f"{' '.join(['more'] * helplevel)} verbose help." optdict = {"action": "callback", "callback": helpfunc, "help": helpmsg} provider = self.options_providers[0] self.add_optik_option(provider, self.cmdline_parser, opt, optdict) provider.options += ((opt, optdict),) helplevel += 1 if config_file is None: config_file = self.config_file if config_file is not None: config_file = expanduser(config_file) if config_file and exists(config_file): parser = self.cfgfile_parser parser.read([config_file]) # normalize sections'title # mypy: "ConfigParser" has no attribute "_sections" # dynamic attribute? 
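# copy values read under a non upper-case section name to the upper-cased key
# (internal sections are registered upper-cased)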
for sect, values in list(parser._sections.items()): # type: ignore if not sect.isupper() and values: parser._sections[sect.upper()] = values # type: ignore elif not self.quiet: msg = "No config file found, using default configuration" print(msg, file=sys.stderr) return def input_config(self, onlysection=None, inputlevel=0, stream=None): """interactively get configuration values by asking to the user and generate a configuration file """ if onlysection is not None: onlysection = onlysection.upper() for provider in self.options_providers: for section, option, optdict in provider.all_options(): if onlysection is not None and section != onlysection: continue if "type" not in optdict: # ignore action without type (callback, store_true...) continue provider.input_option(option, optdict, inputlevel) # now we can generate the configuration file if stream is not None: self.generate_config(stream) def load_config_file(self) -> None: """dispatch values previously read from a configuration file to each options provider) """ parser = self.cfgfile_parser for section in parser.sections(): for option, value in parser.items(section): try: self.global_set_option(option, value) except (KeyError, OptionError): # TODO handle here undeclared options appearing in the config file continue def load_configuration(self, **kwargs: Any) -> None: """override configuration according to given parameters""" for opt, opt_value in kwargs.items(): opt = opt.replace("_", "-") provider = self._all_options[opt] provider.set_option(opt, opt_value) def load_command_line_configuration(self, args: List[str] = None) -> List[str]: """override configuration according to command line parameters return additional arguments """ self._monkeypatch_expand_default() try: if args is None: args = sys.argv[1:] else: args = list(args) (options, args) = self.cmdline_parser.parse_args(args=args) for provider in self._nocallback_options.keys(): config = provider.config for attr in config.__dict__.keys(): value = getattr(options, attr, None) if value is None: continue setattr(config, attr, value) return args finally: self._unmonkeypatch_expand_default() # help methods ############################################################ def add_help_section(self, title: str, description: str, level: int = 0) -> None: """add a dummy option section for help purpose""" group = optik_ext.OptionGroup( self.cmdline_parser, title=title.capitalize(), description=description ) # mypy: "OptionGroup" has no attribute "level" # it does, it is set in the optik_ext module group.level = level # type: ignore self._maxlevel = max(self._maxlevel, level) self.cmdline_parser.add_option_group(group) def _monkeypatch_expand_default(self) -> None: # monkey patch optik_ext to deal with our default values try: self.__expand_default_backup = optik_ext.HelpFormatter.expand_default # mypy: Cannot assign to a method # it's dirty but you can optik_ext.HelpFormatter.expand_default = expand_default # type: ignore except AttributeError: # python < 2.4: nothing to be done pass def _unmonkeypatch_expand_default(self) -> None: # remove monkey patch if hasattr(optik_ext.HelpFormatter, "expand_default"): # mypy: Cannot assign to a method # it's dirty but you can # unpatch optik_ext to avoid side effects optik_ext.HelpFormatter.expand_default = self.__expand_default_backup # type: ignore def help(self, level: int = 0) -> str: """return the usage string for available options""" # mypy: "HelpFormatter" has no attribute "output_level" # set in optik_ext self.cmdline_parser.formatter.output_level = 
level # type: ignore self._monkeypatch_expand_default() try: return self.cmdline_parser.format_help() finally: self._unmonkeypatch_expand_default() class Method: """used to ease late binding of default method (so you can define options on the class using default methods on the configuration instance) """ def __init__(self, methname): self.method = methname self._inst = None def bind(self, instance: "Configuration") -> None: """bind the method to its instance""" if self._inst is None: self._inst = instance def __call__(self, *args: Any, **kwargs: Any) -> Dict[str, str]: assert self._inst, "unbound method" return getattr(self._inst, self.method)(*args, **kwargs) # Options Provider ############################################################# class OptionsProviderMixIn: """Mixin to provide options to an OptionsManager""" # those attributes should be overridden priority = -1 name = "default" options: Tuple = () level = 0 def __init__(self) -> None: self.config = optik_ext.Values() for option_tuple in self.options: try: option, optdict = option_tuple except ValueError: raise Exception(f"Bad option: {str(option_tuple)}") if isinstance(optdict.get("default"), Method): optdict["default"].bind(self) elif isinstance(optdict.get("callback"), Method): optdict["callback"].bind(self) self.load_defaults() def load_defaults(self) -> None: """initialize the provider using default values""" for opt, optdict in self.options: action = optdict.get("action") if action != "callback": # callback action have no default default = self.option_default(opt, optdict) if default is REQUIRED: continue self.set_option(opt, default, action, optdict) def option_default(self, opt, optdict=None): """return the default value for an option""" if optdict is None: optdict = self.get_option_def(opt) default = optdict.get("default") if callable(default): default = default() return default def option_attrname(self, opt, optdict=None): """get the config attribute corresponding to opt""" if optdict is None: optdict = self.get_option_def(opt) return optdict.get("dest", opt.replace("-", "_")) def option_value(self, opt): """get the current value for the given option""" return getattr(self.config, self.option_attrname(opt), None) def set_option(self, opt, value, action=None, optdict=None): """method called to set an option (registered in the options list)""" if optdict is None: optdict = self.get_option_def(opt) if value is not None: value = _validate(value, optdict, opt) if action is None: action = optdict.get("action", "store") if optdict.get("type") == "named": # XXX need specific handling optname = self.option_attrname(opt, optdict) currentvalue = getattr(self.config, optname, None) if currentvalue: currentvalue.update(value) value = currentvalue if action == "store": setattr(self.config, self.option_attrname(opt, optdict), value) elif action in ("store_true", "count"): setattr(self.config, self.option_attrname(opt, optdict), 0) elif action == "store_false": setattr(self.config, self.option_attrname(opt, optdict), 1) elif action == "append": opt = self.option_attrname(opt, optdict) _list = getattr(self.config, opt, None) if _list is None: if isinstance(value, (list, tuple)): _list = value elif value is not None: _list = [] _list.append(value) setattr(self.config, opt, _list) elif isinstance(_list, tuple): setattr(self.config, opt, _list + (value,)) else: _list.append(value) elif action == "callback": optdict["callback"](None, opt, value, None) else: raise UnsupportedAction(action) def input_option(self, option, optdict, 
inputlevel=99): default = self.option_default(option, optdict) if default is REQUIRED: defaultstr = "(required): " elif optdict.get("level", 0) > inputlevel: return elif optdict["type"] == "password" or default is None: defaultstr = ": " else: defaultstr = f"(default: {format_option_value(optdict, default)}): " print(f":{option}:") print(optdict.get("help") or option) inputfunc = INPUT_FUNCTIONS[optdict["type"]] value = inputfunc(optdict, defaultstr) while default is REQUIRED and not value: print("please specify a value") value = inputfunc(optdict, f"{option}: ") if value is None and default is not None: value = default self.set_option(option, value, optdict=optdict) def get_option_def(self, opt): """return the dictionary defining an option given it's name""" assert self.options for option in self.options: if option[0] == opt: return option[1] # mypy: Argument 2 to "OptionError" has incompatible type "str"; expected "Option" # seems to be working? similar_options = get_close_matches(opt, (str(x[0]) for x in self.options), n=10) if similar_options: raise OptionError( "no such option %s in section %r.\n\nOptions with a similar name are:\n * %s" % (opt, self.name, "\n * ".join(similar_options)), opt, ) # type: ignore else: raise OptionError( "no such option %s in section %r.\n\nNo option with a similar name, " "all available options:\n * %s" % (opt, self.name, "\n * ".join(str(x[0]) for x in self.options)), opt, ) # type: ignore def all_options(self): """return an iterator on available options for this provider option are actually described by a 3-uple: (section, option name, option dictionary) """ for section, options in self.options_by_section(): if section is None: if self.name is None: continue section = self.name.upper() for option, optiondict, value in options: yield section, option, optiondict def options_by_section(self) -> Iterator[Any]: """return an iterator on options grouped by section (section, [list of (optname, optdict, optvalue)]) """ sections: Dict[str, List[Tuple[str, Dict[str, Any], Any]]] = {} for optname, optdict in self.options: sections.setdefault(optdict.get("group"), []).append( (optname, optdict, self.option_value(optname)) ) if None in sections: # mypy: No overload variant of "pop" of "MutableMapping" matches argument type "None" # it actually works yield None, sections.pop(None) # type: ignore for section, options in sorted(sections.items()): yield section.upper(), options def options_and_values(self, options=None): if options is None: options = self.options for optname, optdict in options: yield (optname, optdict, self.option_value(optname)) # configuration ################################################################ class ConfigurationMixIn(OptionsManagerMixIn, OptionsProviderMixIn): """basic mixin for simple configurations which don't need the manager / providers model """ def __init__(self, *args: Any, **kwargs: Any) -> None: if not args: kwargs.setdefault("usage", "") kwargs.setdefault("quiet", 1) OptionsManagerMixIn.__init__(self, *args, **kwargs) OptionsProviderMixIn.__init__(self) if not getattr(self, "option_groups", None): self.option_groups: List[Tuple[Any, str]] = [] for option, optdict in self.options: try: gdef = (optdict["group"].upper(), "") except KeyError: continue if gdef not in self.option_groups: self.option_groups.append(gdef) self.register_options_provider(self, own_group=False) def register_options(self, options): """add some options to the configuration""" options_by_group = {} for optname, optdict in options: 
options_by_group.setdefault(optdict.get("group", self.name.upper()), []).append( (optname, optdict) ) for group, group_options in options_by_group.items(): self.add_option_group(group, None, group_options, self) self.options += tuple(options) def load_defaults(self): OptionsProviderMixIn.load_defaults(self) def __iter__(self): return iter(self.config.__dict__.items()) def __getitem__(self, key): try: return getattr(self.config, self.option_attrname(key)) except (optik_ext.OptionValueError, AttributeError): raise KeyError(key) def __setitem__(self, key, value): self.set_option(key, value) def get(self, key, default=None): try: return self[key] except (OptionError, KeyError): return default class Configuration(ConfigurationMixIn): """class for simple configurations which don't need the manager / providers model and prefer delegation to inheritance configuration values are accessible through a dict like interface """ def __init__( self, config_file=None, options=None, name=None, usage=None, doc=None, version=None ): if options is not None: self.options = options if name is not None: self.name = name if doc is not None: self.__doc__ = doc super(Configuration, self).__init__(config_file=config_file, usage=usage, version=version) class OptionsManager2ConfigurationAdapter: """Adapt an option manager to behave like a `logilab.common.configuration.Configuration` instance """ def __init__(self, provider): self.config = provider def __getattr__(self, key): return getattr(self.config, key) def __getitem__(self, key): provider = self.config._all_options[key] try: return getattr(provider.config, provider.option_attrname(key)) except AttributeError: raise KeyError(key) def __setitem__(self, key, value): self.config.global_set_option(self.config.option_attrname(key), value) def get(self, key, default=None): try: return self[key] except KeyError: return default # other functions ############################################################## def read_old_config(newconfig, changes, configfile): """initialize newconfig from a deprecated configuration file possible changes: * ('renamed', oldname, newname) * ('moved', option, oldgroup, newgroup) * ('typechanged', option, oldtype, newvalue) """ # build an index of changes changesindex = {} for action in changes: if action[0] == "moved": option, oldgroup, newgroup = action[1:] changesindex.setdefault(option, []).append((action[0], oldgroup, newgroup)) continue if action[0] == "renamed": oldname, newname = action[1:] changesindex.setdefault(newname, []).append((action[0], oldname)) continue if action[0] == "typechanged": option, oldtype, newvalue = action[1:] changesindex.setdefault(option, []).append((action[0], oldtype, newvalue)) continue if action[0] in ("added", "removed"): continue # nothing to do here raise Exception(f"unknown change {action[0]}") # build a config object able to read the old config options = [] for optname, optdef in newconfig.options: for action in changesindex.pop(optname, ()): if action[0] == "moved": oldgroup, newgroup = action[1:] optdef = optdef.copy() optdef["group"] = oldgroup elif action[0] == "renamed": optname = action[1] elif action[0] == "typechanged": oldtype = action[1] optdef = optdef.copy() optdef["type"] = oldtype options.append((optname, optdef)) if changesindex: raise Exception(f"unapplied changes: {changesindex}") oldconfig = Configuration(options=options, name=newconfig.name) # read the old config oldconfig.load_file_configuration(configfile) # apply values reverting changes changes.reverse() done = set() for action in 
changes: if action[0] == "renamed": oldname, newname = action[1:] newconfig[newname] = oldconfig[oldname] done.add(newname) elif action[0] == "typechanged": optname, oldtype, newvalue = action[1:] newconfig[optname] = newvalue done.add(optname) for optname, optdef in newconfig.options: if optdef.get("type") and optname not in done: newconfig.set_option(optname, oldconfig[optname], optdict=optdef) def merge_options(options, optgroup=None): """preprocess a list of options and remove duplicates, returning a new list (tuple actually) of options. Options dictionaries are copied to avoid later side-effect. Also, if `otpgroup` argument is specified, ensure all options are in the given group. """ alloptions = {} options = list(options) for i in range(len(options) - 1, -1, -1): optname, optdict = options[i] if optname in alloptions: options.pop(i) alloptions[optname].update(optdict) else: optdict = optdict.copy() options[i] = (optname, optdict) alloptions[optname] = optdict if optgroup is not None: alloptions[optname]["group"] = optgroup return tuple(options) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/daemon.py0000666000000000000000000000631314762603732017770 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """A daemonize function (for Unices)""" __docformat__ = "restructuredtext en" import os import errno import warnings def setugid(user): """Change process user and group ID Argument is a numeric user id or a user name""" try: from pwd import getpwuid passwd = getpwuid(int(user)) except ValueError: from pwd import getpwnam passwd = getpwnam(user) if hasattr(os, "initgroups"): # python >= 2.7 os.initgroups(passwd.pw_name, passwd.pw_gid) else: import ctypes if ctypes.CDLL(None).initgroups(passwd.pw_name, passwd.pw_gid) < 0: err = ctypes.c_int.in_dll(ctypes.pythonapi, "errno").value raise OSError(err, os.strerror(err), "initgroups") os.setgid(passwd.pw_gid) os.setuid(passwd.pw_uid) os.environ["HOME"] = passwd.pw_dir def daemonize(pidfile=None, uid=None, umask=0o77): """daemonize a Unix process. Set paranoid umask by default. Return 1 in the original process, 2 in the first fork, and None for the second fork (eg daemon process). """ # http://www.faqs.org/faqs/unix-faq/programmer/faq/ # # fork so the parent can exit if os.fork(): # launch child and... return 1 # disconnect from tty and create a new session os.setsid() # fork again so the parent, (the session group leader), can exit. # as a non-session group leader, we can never regain a controlling # terminal. if os.fork(): # launch child again. 
return 2 # move to the root to avoit mount pb os.chdir("/") # redirect standard descriptors null = os.open("/dev/null", os.O_RDWR) for i in range(3): try: os.dup2(null, i) except OSError as e: if e.errno != errno.EBADF: raise os.close(null) # filter warnings warnings.filterwarnings("ignore") # write pid in a file if pidfile: # ensure the directory where the pid-file should be set exists (for # instance /var/run/cubicweb may be deleted on computer restart) piddir = os.path.dirname(pidfile) if not os.path.exists(piddir): os.makedirs(piddir) f = open(pidfile, "w") f.write(str(os.getpid())) f.close() # set umask if specified if umask is not None: os.umask(umask) # change process uid if uid: setugid(uid) return None ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/date.py0000666000000000000000000002722214762603732017444 0ustar00rootroot# copyright 2003-2012 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Date manipulation helper functions.""" __docformat__ = "restructuredtext en" import math import re import sys from locale import getlocale, LC_TIME from datetime import date, time, datetime, timedelta from time import strptime as time_strptime from calendar import monthrange, timegm from typing import Union, List, Any, Optional, Generator try: from mx.DateTime import RelativeDateTime, Date, DateTimeType except ImportError: endOfMonth = None DateTimeType = datetime else: endOfMonth = RelativeDateTime(months=1, day=-1) # NOTE: should we implement a compatibility layer between date representations # as we have in lgc.db ? 
FRENCH_FIXED_HOLIDAYS = { "jour_an": "%s-01-01", "fete_travail": "%s-05-01", "armistice1945": "%s-05-08", "fete_nat": "%s-07-14", "assomption": "%s-08-15", "toussaint": "%s-11-01", "armistice1918": "%s-11-11", "noel": "%s-12-25", } FRENCH_MOBILE_HOLIDAYS = { "paques2004": "2004-04-12", "ascension2004": "2004-05-20", "pentecote2004": "2004-05-31", "paques2005": "2005-03-28", "ascension2005": "2005-05-05", "pentecote2005": "2005-05-16", "paques2006": "2006-04-17", "ascension2006": "2006-05-25", "pentecote2006": "2006-06-05", "paques2007": "2007-04-09", "ascension2007": "2007-05-17", "pentecote2007": "2007-05-28", "paques2008": "2008-03-24", "ascension2008": "2008-05-01", "pentecote2008": "2008-05-12", "paques2009": "2009-04-13", "ascension2009": "2009-05-21", "pentecote2009": "2009-06-01", "paques2010": "2010-04-05", "ascension2010": "2010-05-13", "pentecote2010": "2010-05-24", "paques2011": "2011-04-25", "ascension2011": "2011-06-02", "pentecote2011": "2011-06-13", "paques2012": "2012-04-09", "ascension2012": "2012-05-17", "pentecote2012": "2012-05-28", } # XXX this implementation cries for multimethod dispatching def get_step(dateobj: Union[date, datetime], nbdays: int = 1) -> timedelta: # assume date is either a python datetime or a mx.DateTime object if isinstance(dateobj, date): return ONEDAY * nbdays return nbdays # mx.DateTime is ok with integers def datefactory( year: int, month: int, day: int, sampledate: Union[date, datetime] ) -> Union[date, datetime]: # assume date is either a python datetime or a mx.DateTime object if isinstance(sampledate, datetime): return datetime(year, month, day) if isinstance(sampledate, date): return date(year, month, day) return Date(year, month, day) def weekday(dateobj: Union[date, datetime]) -> int: # assume date is either a python datetime or a mx.DateTime object if isinstance(dateobj, date): return dateobj.weekday() return dateobj.day_of_week def str2date(datestr: str, sampledate: Union[date, datetime]) -> Union[date, datetime]: # NOTE: datetime.strptime is not an option until we drop py2.4 compat year, month, day = [int(chunk) for chunk in datestr.split("-")] return datefactory(year, month, day, sampledate) def days_between(start: Union[date, datetime], end: Union[date, datetime]) -> int: if isinstance(start, date): # mypy: No overload variant of "__sub__" of "datetime" matches argument type "date" # we ensure that end is a date assert isinstance(end, date) delta = end - start # type: ignore # datetime.timedelta.days is always an integer (floored) if delta.seconds: return delta.days + 1 return delta.days else: return int(math.ceil((end - start).days)) def get_national_holidays( begin: Union[date, datetime], end: Union[date, datetime] ) -> Union[List[date], List[datetime]]: """return french national days off between begin and end""" begin = datefactory(begin.year, begin.month, begin.day, begin) end = datefactory(end.year, end.month, end.day, end) holidays = [str2date(datestr, begin) for datestr in FRENCH_MOBILE_HOLIDAYS.values()] for year in range(begin.year, end.year + 1): for datestr in FRENCH_FIXED_HOLIDAYS.values(): date = str2date(datestr % year, begin) if date not in holidays: holidays.append(date) return [day for day in holidays if begin <= day < end] def add_days_worked(start: date, days: int) -> date: """adds date but try to only take days worked into account""" step = get_step(start) weeks, plus = divmod(days, 5) end = start + ((weeks * 7) + plus) * step if weekday(end) >= 5: # saturday or sunday end += 2 * step end += len([x for x in 
get_national_holidays(start, end + step) if weekday(x) < 5]) * step if weekday(end) >= 5: # saturday or sunday end += 2 * step return end def nb_open_days(start: Union[date, datetime], end: Union[date, datetime]) -> int: assert start <= end step = get_step(start) days = days_between(start, end) weeks, plus = divmod(days, 7) if weekday(start) > weekday(end): plus -= 2 elif weekday(end) == 6: plus -= 1 open_days = weeks * 5 + plus nb_week_holidays = len( [x for x in get_national_holidays(start, end + step) if weekday(x) < 5 and x < end] ) open_days -= nb_week_holidays if open_days < 0: return 0 return open_days def date_range( begin: date, end: date, incday: Optional[Any] = None, incmonth: Optional[bool] = None ) -> Generator[date, Any, None]: """yields each date between begin and end :param begin: the start date :param end: the end date :param incr: the step to use to iterate over dates. Default is one day. :param include: None (means no exclusion) or a function taking a date as parameter, and returning True if the date should be included. When using mx datetime, you should *NOT* use incmonth argument, use instead oneDay, oneHour, oneMinute, oneSecond, oneWeek or endOfMonth (to enumerate months) as `incday` argument """ assert not (incday and incmonth) begin = todate(begin) end = todate(end) if incmonth: while begin < end: yield begin begin = next_month(begin, incmonth) else: incr = get_step(begin, incday or 1) while begin < end: yield begin begin += incr # makes py datetime usable ##################################################### ONEDAY: timedelta = timedelta(days=1) ONEWEEK: timedelta = timedelta(days=7) def strptime_time(value, format="%H:%M"): return time(*time_strptime(value, format)[3:6]) def todate(somedate: date) -> date: """return a date from a date (leaving unchanged) or a datetime""" if isinstance(somedate, datetime): return date(somedate.year, somedate.month, somedate.day) assert isinstance(somedate, (date, DateTimeType)), repr(somedate) return somedate def totime(somedate): """return a time from a time (leaving unchanged), date or datetime""" # XXX mx compat if not isinstance(somedate, time): return time(somedate.hour, somedate.minute, somedate.second) assert isinstance(somedate, (time)), repr(somedate) return somedate def todatetime(somedate): """return a date from a date (leaving unchanged) or a datetime""" # take care, datetime is a subclass of date if isinstance(somedate, datetime): return somedate assert isinstance(somedate, (date, DateTimeType)), repr(somedate) return datetime(somedate.year, somedate.month, somedate.day) def datetime2ticks(somedate: Union[date, datetime]) -> int: return timegm(somedate.timetuple()) * 1000 + int(getattr(somedate, "microsecond", 0) / 1000) def ticks2datetime(ticks: int) -> datetime: miliseconds, microseconds = divmod(ticks, 1000) try: return datetime.fromtimestamp(miliseconds) except (ValueError, OverflowError): epoch = datetime.fromtimestamp(0) nb_days, seconds = divmod(int(miliseconds), 86400) delta = timedelta(nb_days, seconds=seconds, microseconds=microseconds) try: return epoch + delta except (ValueError, OverflowError): raise def days_in_month(somedate: date) -> int: return monthrange(somedate.year, somedate.month)[1] def days_in_year(somedate): feb = date(somedate.year, 2, 1) if days_in_month(feb) == 29: return 366 else: return 365 def previous_month(somedate, nbmonth=1): while nbmonth: somedate = first_day(somedate) - ONEDAY nbmonth -= 1 return somedate def next_month(somedate: date, nbmonth: int = 1) -> date: while nbmonth: 
somedate = last_day(somedate) + ONEDAY nbmonth -= 1 return somedate def first_day(somedate): return date(somedate.year, somedate.month, 1) def last_day(somedate: date) -> date: return date(somedate.year, somedate.month, days_in_month(somedate)) def ustrftime(somedate: datetime, fmt: str = "%Y-%m-%d") -> str: """like strftime, but returns a unicode string instead of an encoded string which may be problematic with localized date. """ if sys.version_info >= (3, 3): # datetime.date.strftime() supports dates since year 1 in Python >=3.3. return somedate.strftime(fmt) else: try: if sys.version_info < (3, 0): encoding = getlocale(LC_TIME)[1] or "ascii" return str(somedate.strftime(str(fmt)), encoding) else: return somedate.strftime(fmt) except ValueError: if somedate.year >= 1900: raise # datetime is not happy with dates before 1900 # we try to work around this, assuming a simple # format string fields = { "Y": somedate.year, "m": somedate.month, "d": somedate.day, } if isinstance(somedate, datetime): fields.update({"H": somedate.hour, "M": somedate.minute, "S": somedate.second}) fmt = re.sub("%([YmdHMS])", r"%(\1)02d", fmt) return str(fmt) % fields def utcdatetime(dt: datetime) -> datetime: if dt.tzinfo is None: return dt # mypy: No overload variant of "__sub__" of "datetime" matches argument type "None" return dt.replace(tzinfo=None) - dt.utcoffset() # type: ignore def utctime(dt): if dt.tzinfo is None: return dt return (dt + dt.utcoffset() + dt.dst()).replace(tzinfo=None) def datetime_to_seconds(date): """return the number of seconds since the begining of the day for that date""" return date.second + 60 * date.minute + 3600 * date.hour def timedelta_to_days(delta): """return the time delta as a number of seconds""" return delta.days + delta.seconds / (3600 * 24) def timedelta_to_seconds(delta): """return the time delta as a fraction of days""" return delta.days * (3600 * 24) + delta.seconds ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/debugger.py0000666000000000000000000001572714762603732020322 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Customized version of pdb's default debugger. 
- sets up a history file - uses ipython if available to colorize lines of code - overrides list command to search for current block instead of using 5 lines of context """ __docformat__ = "restructuredtext en" try: import readline except ImportError: # mypy: Incompatible types in assignment (expression has type "None", # mypy: variable has type Module)) # conditional import readline = None # type: ignore import os import sys from pdb import Pdb import inspect from io import StringIO try: from IPython import PyColorize except ImportError: def colorize(source, start_lineno, curlineno): """fallback colorize function""" return source def colorize_source(source): return source else: def colorize(source, start_lineno, curlineno): """colorize and annotate source with linenos (as in pdb's list command) """ parser = PyColorize.Parser() output = StringIO() parser.format(source, output) annotated = [] for index, line in enumerate(output.getvalue().splitlines()): lineno = index + start_lineno if lineno == curlineno: annotated.append(f"{lineno:>4}\t->\t{line}") else: annotated.append(f"{lineno:>4}\t\t{line}") return "\n".join(annotated) def colorize_source(source): """colorize given source""" parser = PyColorize.Parser() output = StringIO() parser.format(source, output) return output.getvalue() def getsource(obj): """Return the text of the source code for an object. The argument may be a module, class, method, function, traceback, frame, or code object. The source code is returned as a single string. An IOError is raised if the source code cannot be retrieved.""" lines, lnum = inspect.getsourcelines(obj) return "".join(lines), lnum ################################################################ class Debugger(Pdb): """custom debugger - sets up a history file - uses ipython if available to colorize lines of code - overrides list command to search for current block instead of using 5 lines of context """ def __init__(self, tcbk=None): Pdb.__init__(self) self.reset() if tcbk: while tcbk.tb_next is not None: tcbk = tcbk.tb_next self._tcbk = tcbk self._histfile = os.path.expanduser("~/.pdbhist") def setup_history_file(self): """if readline is available, read pdb history file""" if readline is not None: try: # XXX try..except shouldn't be necessary # read_history_file() can accept None readline.read_history_file(self._histfile) except OSError: pass def start(self): """starts the interactive mode""" self.interaction(self._tcbk.tb_frame, self._tcbk) def setup(self, frame, tcbk): """setup hook: set up history file""" self.setup_history_file() Pdb.setup(self, frame, tcbk) def set_quit(self): """quit hook: save commands in the history file""" if readline is not None: readline.write_history_file(self._histfile) Pdb.set_quit(self) def complete_p(self, text, line, begin_idx, end_idx): """provide variable names completion for the ``p`` command""" namespace = dict(self.curframe.f_globals) namespace.update(self.curframe.f_locals) if "." in text: return self.attr_matches(text, namespace) return [varname for varname in namespace if varname.startswith(text)] def attr_matches(self, text, namespace): """implementation coming from rlcompleter.Completer.attr_matches Compute matches when text contains a dot. Assuming the text is of the form NAME.NAME....[NAME], and is evaluatable in self.namespace, it will be evaluated and its attributes (as revealed by dir()) are used as possible completions. (For class instances, class members are also considered.) 
WARNING: this can still invoke arbitrary C code, if an object with a __getattr__ hook is evaluated. """ import re m = re.match(r"(\w+(\.\w+)*)\.(\w*)", text) if not m: return expr, attr = m.group(1, 3) object = eval(expr, namespace) words = dir(object) if hasattr(object, "__class__"): words.append("__class__") words = words + self.get_class_members(object.__class__) matches = [] n = len(attr) for word in words: if word[:n] == attr and word != "__builtins__": matches.append(f"{expr}.{word}") return matches def get_class_members(self, klass): """implementation coming from rlcompleter.get_class_members""" ret = dir(klass) if hasattr(klass, "__bases__"): for base in klass.__bases__: ret = ret + self.get_class_members(base) return ret # specific / overridden commands def do_list(self, arg): """overrides default list command to display the surrounding block instead of 5 lines of context """ self.lastcmd = "list" if not arg: try: source, start_lineno = getsource(self.curframe) print(colorize("".join(source), start_lineno, self.curframe.f_lineno)) except KeyboardInterrupt: pass except OSError: Pdb.do_list(self, arg) else: Pdb.do_list(self, arg) do_l = do_list def do_open(self, arg): """opens source file corresponding to the current stack level""" filename = self.curframe.f_code.co_filename lineno = self.curframe.f_lineno cmd = f"emacsclient --no-wait +{lineno} {filename}" os.system(cmd) do_o = do_open def pm(): """use our custom debugger""" dbg = Debugger(sys.last_traceback) dbg.start() def set_trace(): Debugger().set_trace(sys._getframe().f_back) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/decorators.py0000666000000000000000000002277314762603732020702 0ustar00rootroot# copyright 2003-2013 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . 
"""A few useful function/method decorators.""" __docformat__ = "restructuredtext en" import os import sys from time import process_time, time from inspect import isgeneratorfunction from typing import Any, Optional, Callable, overload, TypeVar from inspect import getfullargspec from logilab.common.compat import method_type # XXX rewrite so we can use the decorator syntax when keyarg has to be specified class cached_decorator: def __init__(self, cacheattr: Optional[str] = None, keyarg: Optional[int] = None) -> None: self.cacheattr = cacheattr self.keyarg = keyarg def __call__(self, callableobj: Optional[Callable] = None) -> Callable: assert not isgeneratorfunction( callableobj ), f"cannot cache generator function: {callableobj}" assert callableobj is not None if len(getfullargspec(callableobj).args) == 1 or self.keyarg == 0: cache = _SingleValueCache(callableobj, self.cacheattr) elif self.keyarg: cache = _MultiValuesKeyArgCache(callableobj, self.keyarg, self.cacheattr) else: cache = _MultiValuesCache(callableobj, self.cacheattr) return cache.closure() class _SingleValueCache: def __init__(self, callableobj: Callable, cacheattr: Optional[str] = None) -> None: self.callable = callableobj if cacheattr is None: self.cacheattr = f"_{callableobj.__name__}_cache_" else: assert cacheattr != callableobj.__name__ self.cacheattr = cacheattr def __call__(__me, self, *args): try: return self.__dict__[__me.cacheattr] except KeyError: value = __me.callable(self, *args) setattr(self, __me.cacheattr, value) return value def closure(self) -> Callable: def wrapped(*args, **kwargs): return self.__call__(*args, **kwargs) # mypy: "Callable[[VarArg(Any), KwArg(Any)], Any]" has no attribute "cache_obj" # dynamic attribute for magic wrapped.cache_obj = self # type: ignore try: wrapped.__doc__ = self.callable.__doc__ wrapped.__name__ = self.callable.__name__ except Exception: pass return wrapped def clear(self, holder): holder.__dict__.pop(self.cacheattr, None) class _MultiValuesCache(_SingleValueCache): def _get_cache(self, holder): try: _cache = holder.__dict__[self.cacheattr] except KeyError: _cache = {} setattr(holder, self.cacheattr, _cache) return _cache def __call__(__me, self, *args, **kwargs): _cache = __me._get_cache(self) try: return _cache[args] except KeyError: _cache[args] = __me.callable(self, *args) return _cache[args] class _MultiValuesKeyArgCache(_MultiValuesCache): def __init__(self, callableobj: Callable, keyarg: int, cacheattr: Optional[str] = None) -> None: super(_MultiValuesKeyArgCache, self).__init__(callableobj, cacheattr) self.keyarg = keyarg def __call__(__me, self, *args, **kwargs): _cache = __me._get_cache(self) key = args[__me.keyarg - 1] try: return _cache[key] except KeyError: _cache[key] = __me.callable(self, *args, **kwargs) return _cache[key] _T = TypeVar("_T", bound=Callable) @overload def cached( callableobj: None = None, keyarg: Optional[int] = None, **kwargs: Any ) -> Callable[[_T], _T]: ... @overload def cached(callableobj: _T = None, keyarg: Optional[int] = None, **kwargs: Any) -> _T: ... def cached(callableobj=None, keyarg=None, **kwargs): """Simple decorator to cache result of method call.""" kwargs["keyarg"] = keyarg decorator = cached_decorator(**kwargs) if callableobj is None: return decorator else: return decorator(callableobj) class cachedproperty: """Provides a cached property equivalent to the stacking of @cached and @property, but more efficient. After first usage, the becomes part of the object's __dict__. Doing: del obj. empties the cache. 
Idea taken from the pyramid_ framework and the mercurial_ project. .. _pyramid: http://pypi.python.org/pypi/pyramid .. _mercurial: http://pypi.python.org/pypi/Mercurial """ __slots__ = ("wrapped",) def __init__(self, wrapped): try: wrapped.__name__ except AttributeError: raise TypeError(f"{wrapped} must have a __name__ attribute") self.wrapped = wrapped # otherwise this breaks sphinx static analysis for __doc__ if os.path.basename(sys.argv[0]) != "sphinx-build": # mypy: Signature of "__doc__" incompatible with supertype "object" # but this works? @property def __doc__(self) -> str: # type: ignore doc = getattr(self.wrapped, "__doc__", None) return "%s" % ("\n%s" % doc if doc else "") def __get__(self, inst, objtype=None): if inst is None: return self val = self.wrapped(inst) setattr(inst, self.wrapped.__name__, val) return val def get_cache_impl(obj, funcname): cls = obj.__class__ member = getattr(cls, funcname) if isinstance(member, property): member = member.fget return member.cache_obj def clear_cache(obj, funcname): """Clear a cache handled by the :func:`cached` decorator. If 'x' class has @cached on its method `foo`, type >>> clear_cache(x, 'foo') to purge this method's cache on the instance. """ get_cache_impl(obj, funcname).clear(obj) def copy_cache(obj, funcname, cacheobj): """Copy cache for from cacheobj to obj.""" cacheattr = get_cache_impl(obj, funcname).cacheattr try: setattr(obj, cacheattr, cacheobj.__dict__[cacheattr]) except KeyError: pass class wproperty: """Simple descriptor expecting to take a modifier function as first argument and looking for a _ to retrieve the attribute. """ def __init__(self, setfunc): self.setfunc = setfunc self.attrname = f"_{setfunc.__name__}" def __set__(self, obj, value): self.setfunc(obj, value) def __get__(self, obj, cls): assert obj is not None return getattr(obj, self.attrname) class classproperty: """this is a simple property-like class but for class attributes.""" def __init__(self, get): self.get = get def __get__(self, inst, cls): return self.get(cls) class iclassmethod: """Descriptor for method which should be available as class method if called on the class or instance method if called on an instance. """ def __init__(self, func): self.func = func def __get__(self, instance, objtype): if instance is None: return method_type(self.func, objtype, objtype.__class__) return method_type(self.func, instance, objtype) def __set__(self, instance, value): raise AttributeError("can't set attribute") def timed(f): def wrap(*args, **kwargs): t = time() c = process_time() res = f(*args, **kwargs) print(f"{f.__name__} clock: {process_time() - c:.9f} / time: {time() - t:.9f}") return res return wrap def locked(acquire, release): """Decorator taking two methods to acquire/release a lock as argument, returning a decorator function which will call the inner method after having called acquire(self) et will call release(self) afterwards. """ def decorator(f): def wrapper(self: Any, *args: Any, **kwargs: Any) -> Any: acquire(self) try: return f(self, *args, **kwargs) finally: release(self) return wrapper return decorator def monkeypatch(klass: type, methodname: Optional[str] = None) -> Callable: """Decorator extending class with the decorated callable. This is basically a syntactic sugar vs class assignment. >>> class A: ... pass >>> @monkeypatch(A) ... def meth(self): ... return 12 ... >>> a = A() >>> a.meth() 12 >>> @monkeypatch(A, 'foo') ... def meth(self): ... return 12 ... 
>>> a.foo() 12 """ def decorator(func): try: name = methodname or func.__name__ except AttributeError: raise AttributeError( "%s has no __name__ attribute: " "you should provide an explicit `methodname`" % func ) setattr(klass, name, func) return func return decorator ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/deprecation.py0000666000000000000000000006424014762603732021025 0ustar00rootroot# copyright 2003-2012 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Deprecation utilities.""" __docformat__ = "restructuredtext en" import os import sys import inspect from enum import Enum from warnings import warn from functools import WRAPPER_ASSIGNMENTS, WRAPPER_UPDATES from importlib import import_module from typing import Any, Callable, Dict, Optional, Type from typing_extensions import Protocol if sys.version_info >= (3, 8): from importlib import metadata as importlib_metadata else: import importlib_metadata class FakeDistribution(importlib_metadata.Distribution): "see https://github.com/python/importlib_metadata/blob/main/CHANGES.rst#v600" def locate_file(self): pass def read_text(self): pass def _unstack_all_deprecation_decorators(function): """ This is another super edge magic case which is needed because we uses lazy_wraps because of logilab.common.modutils.LazyObject and because __name__ has special behavior and doesn't work like a normal attribute and that __getattribute__ of lazy_wraps is bypassed. Therefor, to get the real callable name when several lazy_wrapped decorator are used we need to travers the __wrapped__ attributes chain. """ while hasattr(function, "__wrapped__"): function = function.__wrapped__ return function def get_real__name__(some_callable: Callable) -> str: return _unstack_all_deprecation_decorators(some_callable).__name__ def get_real__module__(some_callable: Callable) -> str: return _unstack_all_deprecation_decorators(some_callable).__module__ def lazy_wraps(wrapped: Callable) -> Callable: """ This is the equivalent of the @wraps decorator of functools except it won't try to grabs attributes of the targeted function on decoration but on access. This is needed because of logilab.common.modutils.LazyObject. Indeed: if you try to decorate a LazyObject with @wraps, wraps will try to access attributes of LazyObject and this will trigger the attempt to import the module decorated by LazyObject which you don't want to do when you just want to mark this LazyObject has been a deprecated objet that you only wants to trigger if the user try to use it. Usage: like @wraps() >>> @lazy_wraps(function) >>> def wrapper(*args, **kwargs): ... 
""" def update_wrapper_attributes(wrapper: Callable) -> Callable: def __getattribute__(self, attribute: str) -> Any: if attribute in WRAPPER_ASSIGNMENTS: return getattr(wrapped, attribute) return super(self.__class__, self).__getattribute__(attribute) wrapper.__getattribute__ = __getattribute__ # type: ignore for attribute in WRAPPER_UPDATES: getattr(wrapper, attribute).update(getattr(wrapped, attribute, {})) wrapper.__wrapped__ = wrapped # type: ignore return wrapper return update_wrapper_attributes class DeprecationWrapper: """proxy to print a warning on access to any attribute of the wrapped object""" def __init__( self, proxied: Any, msg: Optional[str] = None, version: Optional[str] = None ) -> None: self._proxied: Any = proxied self._msg: str = msg if msg else "" self.version: Optional[str] = version def __getattr__(self, attr: str) -> Any: send_warning( self._msg, deprecation_class=DeprecationWarning, deprecation_class_kwargs={}, stacklevel=3, version=self.version, ) return getattr(self._proxied, attr) def __setattr__(self, attr: str, value: Any) -> None: if attr in ("_proxied", "_msg"): self.__dict__[attr] = value else: send_warning( self._msg, deprecation_class=DeprecationWarning, deprecation_class_kwargs={}, stacklevel=3, version=self.version, ) setattr(self._proxied, attr, value) def _get_module_name(number: int = 1) -> str: """ automagically try to determine the package name from which the warning has been triggered by loop other calling frames. If it fails to do so, return an empty string. """ frame = sys._getframe() for i in range(number + 1): if frame.f_back is None: break frame = frame.f_back if frame.f_globals["__package__"]: return frame.f_globals["__package__"] file_name = os.path.split(frame.f_globals["__file__"])[1] if file_name.endswith(".py"): file_name = file_name[: -len(".py")] return file_name _cached_path_to_package: Optional[Dict[str, Optional[str]]] = None def _get_package_name(python_object) -> Optional[str]: # only do this work if we are in a pytest session if "COLLECT_DEPRECATION_WARNINGS_PACKAGE_NAME" not in os.environ: return None global _cached_path_to_package if _cached_path_to_package is None: _cached_path_to_package = {} # mypy fails to understand the result of .discover(): Cannot # instantiate abstract class 'Distribution' with abstract attributes # 'locate_file' and 'read_text' for distribution in FakeDistribution().discover(): # type: ignore # sometime distribution has a "name" attribute, sometime not if distribution.files and hasattr(distribution, "name"): for file in distribution.files: _cached_path_to_package[str(distribution.locate_file(file))] = distribution.name continue if distribution.files and "name" in distribution.metadata: for file in distribution.files: _cached_path_to_package[str(distribution.locate_file(file))] = ( distribution.metadata["name"] ) try: return _cached_path_to_package.get( inspect.getfile(_unstack_all_deprecation_decorators(python_object)) ) except TypeError: return None def send_warning( reason: str, deprecation_class: Type[DeprecationWarning], deprecation_class_kwargs: Dict[str, Any], version: Optional[str] = None, stacklevel: int = 2, module_name: Optional[str] = None, ) -> None: """Display a deprecation message only if the version is older than the compatible version. 
""" if module_name and version: reason = f"[{module_name} {version}] {reason}" elif module_name: reason = f"[{module_name}] {reason}" elif version: reason = f"[{version}] {reason}" warn( deprecation_class(reason, **deprecation_class_kwargs), stacklevel=stacklevel # type: ignore ) class DeprecationWarningKind(Enum): ARGUMENT = "argument" ATTRIBUTE = "attribute" CALLABLE = "callable" CLASS = "class" MODULE = "module" class DeprecationWarningOperation(Enum): DEPRECATED = "deprecated" MOVED = "moved" REMOVED = "removed" RENAMED = "renamed" class StructuredDeprecationWarning(DeprecationWarning): """ Base class for all structured DeprecationWarning Mostly used with isinstance """ def __init__(self, reason: str, package: str = None, version: str = None): self.reason: str = reason self.package = package self.version = version def __str__(self) -> str: return self.reason class TargetRenamedDeprecationWarning(StructuredDeprecationWarning): def __init__( self, reason: str, kind: DeprecationWarningKind, old_name: str, new_name: str, package: str = None, version: str = None, ): super().__init__(reason, package=package, version=version) self.operation = DeprecationWarningOperation.RENAMED self.kind: DeprecationWarningKind = kind # callable, class, module, argument, attribute self.old_name: str = old_name self.new_name: str = new_name class TargetDeprecatedDeprecationWarning(StructuredDeprecationWarning): def __init__( self, reason: str, kind: DeprecationWarningKind, package: str = None, version: str = None ): super().__init__(reason, package=package, version=version) self.operation = DeprecationWarningOperation.DEPRECATED self.kind: DeprecationWarningKind = kind # callable, class, module, argument, attribute class TargetRemovedDeprecationWarning(StructuredDeprecationWarning): def __init__( self, reason: str, kind: DeprecationWarningKind, name: str, package: str = None, version: str = None, ): super().__init__(reason, package=package, version=version) self.operation = DeprecationWarningOperation.REMOVED self.kind: DeprecationWarningKind = kind # callable, class, module, argument, attribute self.name: str = name class TargetMovedDeprecationWarning(StructuredDeprecationWarning): def __init__( self, reason: str, kind: DeprecationWarningKind, old_name: str, new_name: str, old_module: str, new_module: str, package: str = None, version: str = None, ): super().__init__(reason, package=package, version=version) self.operation = DeprecationWarningOperation.MOVED self.kind: DeprecationWarningKind = kind # callable, class, module, argument, attribute self.old_name: str = old_name self.new_name: str = new_name self.old_module: str = old_module self.new_module: str = new_module def callable_renamed( old_name: str, new_function: Callable, version: Optional[str] = None ) -> Callable: """use to tell that a callable has been renamed. It returns a callable wrapper, so that when its called a warning is printed telling what is the object new name. 
>>> old_function = renamed('old_function', new_function) >>> old_function() sample.py:57: DeprecationWarning: old_function has been renamed and is deprecated, uses new_function instead old_function() >>> """ @lazy_wraps(new_function) def wrapped(*args, **kwargs): send_warning( ( f"{old_name} has been renamed and is deprecated, uses " f"{get_real__name__(new_function)} instead" ), TargetRenamedDeprecationWarning, deprecation_class_kwargs={ "kind": DeprecationWarningKind.CALLABLE, "old_name": old_name, "new_name": get_real__name__(new_function), "version": version, "package": _get_package_name(new_function), }, stacklevel=3, version=version, module_name=get_real__module__(new_function), ) return new_function(*args, **kwargs) return wrapped def argument_removed(old_argument_name: str, version: Optional[str] = None) -> Callable: """ callable decorator to allow getting backward compatibility for renamed keyword arguments. >>> @argument_removed("old") ... def some_function(new): ... return new >>> some_function(old=42) sample.py:15: DeprecationWarning: argument old of callable some_function has been renamed and is deprecated, use keyword argument new instead some_function(old=42) 42 """ def _wrap(func: Callable) -> Callable: @lazy_wraps(func) def check_kwargs(*args, **kwargs): if old_argument_name in kwargs: send_warning( f"argument {old_argument_name} of callable {get_real__name__(func)} has been " f"removed and is deprecated", deprecation_class=TargetRemovedDeprecationWarning, deprecation_class_kwargs={ "kind": DeprecationWarningKind.ARGUMENT, "name": old_argument_name, "version": version, "package": _get_package_name(func), }, stacklevel=3, version=version, module_name=get_real__module__(func), ) del kwargs[old_argument_name] return func(*args, **kwargs) return check_kwargs return _wrap def callable_deprecated( reason: Optional[str] = None, version: Optional[str] = None, stacklevel: int = 2 ) -> Callable: """Display a deprecation message only if the version is older than the compatible version. """ def decorator(func: Callable) -> Callable: @lazy_wraps(func) def wrapped(*args, **kwargs) -> Callable: message: str = reason or 'The function "%s" is deprecated' if "%s" in message: message %= get_real__name__(func) send_warning( message, deprecation_class=TargetDeprecatedDeprecationWarning, deprecation_class_kwargs={ "kind": DeprecationWarningKind.CALLABLE, "version": version, "package": _get_package_name(func), }, version=version, stacklevel=stacklevel + 1, module_name=get_real__module__(func), ) return func(*args, **kwargs) return wrapped return decorator class CallableDeprecatedCallable(Protocol): def __call__( self, reason: Optional[str] = None, version: Optional[str] = None, stacklevel: int = 2 ) -> Callable: ... 
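# A minimal usage sketch of the deprecation helpers defined above
# (callable_deprecated, argument_removed, callable_renamed). The names
# old_sum, render, legacy_sum and the "verbose" keyword are hypothetical,
# introduced here only for illustration; nothing below is called at import
# time.
def _deprecation_usage_sketch() -> None:
    """Sketch showing how the decorators above are meant to be combined."""

    @callable_deprecated('"old_sum" is deprecated, use the builtin sum() instead')
    def old_sum(a, b):
        return a + b

    @argument_removed("verbose")
    def render(template):
        return template.upper()

    # Each call emits one of the structured warning classes defined in this
    # module: TargetDeprecatedDeprecationWarning for old_sum,
    # TargetRemovedDeprecationWarning for the dropped "verbose" keyword,
    # TargetRenamedDeprecationWarning for the renamed callable.
    old_sum(1, 2)
    render("title", verbose=True)  # "verbose" is warned about, then discarded

    legacy_sum = callable_renamed("legacy_sum", old_sum)
    legacy_sum(1, 2)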
def _generate_class_deprecated(): class _class_deprecated(type): """metaclass to print a warning on instantiation of a deprecated class""" def __call__(cls, *args, **kwargs): message = getattr(cls, "__deprecation_warning__", "%(cls)s is deprecated") % { "cls": get_real__name__(cls) } send_warning( message, deprecation_class=getattr( cls, "__deprecation_warning_class__", TargetDeprecatedDeprecationWarning ), deprecation_class_kwargs=getattr( cls, "__deprecation_warning_class_kwargs__", { "kind": DeprecationWarningKind.CLASS, "package": _get_package_name(cls), "version": getattr(cls, "__deprecation_warning_version__", None), }, ), module_name=getattr( cls, "__deprecation_warning_module_name__", _get_module_name(1) ), stacklevel=getattr(cls, "__deprecation_warning_stacklevel__", 3), version=getattr(cls, "__deprecation_warning_version__", None), ) return type.__call__(cls, *args, **kwargs) return _class_deprecated class_deprecated = _generate_class_deprecated() def attribute_renamed(old_name: str, new_name: str, version: Optional[str] = None) -> Callable: """ class decorator to allow getting backward compatibility for renamed attributes. >>> @attribute_renamed(old_name="old", new_name="new") ... class SomeClass: ... def __init__(self): ... self.new = 42 >>> some_class = SomeClass() >>> print(some_class.old) sample.py:15: DeprecationWarning: SomeClass.old has been renamed and is deprecated, use SomeClass.new instead print(some_class.old) 42 >>> some_class.old = 43 sample.py:16: DeprecationWarning: SomeClass.old has been renamed and is deprecated, use SomeClass.new instead some_class.old = 43 >>> some_class.old == some_class.new True """ def _class_wrap(klass: type) -> type: reason = ( f"{get_real__name__(klass)}.{old_name} has been renamed and is deprecated, use " f"{get_real__name__(klass)}.{new_name} instead" ) def _get_old(self) -> Any: send_warning( reason, deprecation_class=TargetRenamedDeprecationWarning, deprecation_class_kwargs={ "kind": DeprecationWarningKind.ATTRIBUTE, "old_name": old_name, "new_name": new_name, "version": version, "package": _get_package_name(klass), }, stacklevel=3, version=version, module_name=get_real__module__(klass), ) return getattr(self, new_name) def _set_old(self, value) -> None: send_warning( reason, deprecation_class=TargetRenamedDeprecationWarning, deprecation_class_kwargs={ "kind": DeprecationWarningKind.ATTRIBUTE, "old_name": old_name, "new_name": new_name, "version": version, "package": _get_package_name(klass), }, stacklevel=3, version=version, module_name=get_real__module__(klass), ) setattr(self, new_name, value) def _del_old(self): send_warning( reason, deprecation_class=TargetRenamedDeprecationWarning, deprecation_class_kwargs={ "kind": DeprecationWarningKind.ATTRIBUTE, "old_name": old_name, "new_name": new_name, "version": version, "package": _get_package_name(klass), }, stacklevel=3, version=version, module_name=get_real__module__(klass), ) delattr(self, new_name) setattr(klass, old_name, property(_get_old, _set_old, _del_old)) return klass return _class_wrap def argument_renamed(old_name: str, new_name: str, version: Optional[str] = None) -> Callable: """ callable decorator to allow getting backward compatibility for renamed keyword arguments. >>> @argument_renamed(old_name="old", new_name="new") ... def some_function(new): ... 
return new >>> some_function(old=42) sample.py:15: DeprecationWarning: argument old of callable some_function has been renamed and is deprecated, use keyword argument new instead some_function(old=42) 42 """ def _wrap(func: Callable) -> Callable: @lazy_wraps(func) def check_kwargs(*args, **kwargs) -> Callable: if old_name in kwargs and new_name in kwargs: raise ValueError( f"argument {old_name} of callable {get_real__name__(func)} has been " f"renamed to {new_name} but you are both using {old_name} and " f"{new_name} has keyword arguments, only uses {new_name}" ) if old_name in kwargs: send_warning( f"argument {old_name} of callable {get_real__name__(func)} has been renamed " f"and is deprecated, use keyword argument {new_name} instead", deprecation_class=TargetRenamedDeprecationWarning, deprecation_class_kwargs={ "kind": DeprecationWarningKind.ARGUMENT, "old_name": old_name, "new_name": new_name, "version": version, "package": _get_package_name(func), }, stacklevel=3, version=version, module_name=get_real__module__(func), ) kwargs[new_name] = kwargs[old_name] del kwargs[old_name] return func(*args, **kwargs) return check_kwargs return _wrap def callable_moved( module_name: str, object_name: str, version: Optional[str] = None, stacklevel: int = 2, new_name: Optional[str] = None, ) -> Callable: """use to tell that a callable has been moved to a new module. It returns a callable wrapper, so that when its called a warning is printed telling where the object can be found, import is done (and not before) and the actual object is called. NOTE: the usage is somewhat limited on classes since it will fail if the wrapper is use in a class ancestors list, use the `class_moved` function instead (which has no lazy import feature though). """ # in case the callable has been renamed new_name = new_name if new_name is not None else object_name old_module = _get_module_name(1) message = "object %s.%s has been moved to %s.%s" % ( old_module, object_name, module_name, object_name, ) def callnew(*args, **kwargs): m = import_module(module_name) send_warning( message, deprecation_class=TargetMovedDeprecationWarning, deprecation_class_kwargs={ "kind": DeprecationWarningKind.CALLABLE, "old_name": object_name, "new_name": new_name, "old_module": old_module, "new_module": module_name, "version": version, "package": _get_package_name(getattr(m, object_name)), }, version=version, stacklevel=stacklevel + 1, module_name=old_module, ) return getattr(m, object_name)(*args, **kwargs) return callnew def class_renamed( old_name: str, new_class: type, message: Optional[str] = None, version: Optional[str] = None, module_name: Optional[str] = None, deprecated_warning_class=TargetRenamedDeprecationWarning, deprecated_warning_kwargs=None, ) -> type: """automatically creates a class which fires a DeprecationWarning when instantiated. 
>>> Set = class_renamed('Set', set, 'Set is now replaced by set') >>> s = Set() sample.py:57: DeprecationWarning: Set is now replaced by set s = Set() >>> """ class_dict: Dict[str, Any] = {} if message is None: message = f"{old_name} is deprecated, use {get_real__name__(new_class)} instead" class_dict["__deprecation_warning__"] = message class_dict["__deprecation_warning_class__"] = deprecated_warning_class if deprecated_warning_kwargs is None: class_dict["__deprecation_warning_class_kwargs__"] = { "kind": DeprecationWarningKind.CLASS, "old_name": old_name, "new_name": get_real__name__(new_class), "version": version, "package": _get_package_name(new_class), } else: class_dict["__deprecation_warning_class_kwargs__"] = deprecated_warning_kwargs class_dict["__deprecation_warning_version__"] = version class_dict["__deprecation_warning_stacklevel__"] = 3 if module_name: class_dict["__deprecation_warning_module_name__"] = module_name else: class_dict["__deprecation_warning_module_name__"] = _get_module_name(1) try: return class_deprecated(old_name, (new_class,), class_dict) except (NameError, TypeError): # in case of conflicting metaclass situation # mypy can't handle dynamic base classes https://github.com/python/mypy/issues/2477 class DeprecatedClass(new_class): # type: ignore def __init__(self, *args, **kwargs): msg = class_dict.get( "__deprecation_warning__", f"{old_name} is deprecated, use {get_real__name__(new_class)} instead", ) send_warning( msg, deprecation_class=TargetRenamedDeprecationWarning, deprecation_class_kwargs={ "kind": DeprecationWarningKind.CLASS, "old_name": old_name, "new_name": get_real__name__(new_class), "version": version, "package": _get_package_name(new_class), }, stacklevel=class_dict.get("__deprecation_warning_stacklevel__", 3), version=class_dict.get("__deprecation_warning_version__", None), ) super(DeprecatedClass, self).__init__(*args, **kwargs) return DeprecatedClass def class_moved( new_class: type, old_name: Optional[str] = None, message: Optional[str] = None, version: Optional[str] = None, ) -> type: """nice wrapper around class_renamed when a class has been moved into another module """ if old_name is None: old_name = get_real__name__(new_class) old_module = _get_module_name(1) if message is None: message = "class %s.%s is now available as %s.%s" % ( old_module, old_name, get_real__module__(new_class), get_real__name__(new_class), ) module_name = _get_module_name(1) return class_renamed( old_name, new_class, message=message, version=version, module_name=module_name, deprecated_warning_class=TargetMovedDeprecationWarning, deprecated_warning_kwargs={ "kind": DeprecationWarningKind.CLASS, "old_module": old_module, "new_module": get_real__module__(new_class), "old_name": old_name, "new_name": get_real__name__(new_class), "version": version, "package": _get_package_name(new_class), }, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/fileutils.py0000666000000000000000000002764414762603732020537 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. 
# # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """File and file-path manipulation utilities. :group path manipulation: first_level_directory, relative_path, is_binary,\ get_by_ext, remove_dead_links :group file manipulation: norm_read, norm_open, lines, stream_lines, lines,\ write_open_mode, ensure_fs_mode, export :sort: path manipulation, file manipulation """ __docformat__ = "restructuredtext en" import io import sys import shutil import mimetypes from os.path import isabs, isdir, islink, split, exists, normpath, join from os.path import abspath from os import sep, mkdir, remove, listdir, stat, chmod, walk from stat import ST_MODE, S_IWRITE from typing import Optional, List, Tuple from io import FileIO from _io import TextIOWrapper from logilab.common import STD_BLACKLIST as BASE_BLACKLIST, IGNORED_EXTENSIONS def first_level_directory(path: str) -> str: """Return the first level directory of a path. >>> first_level_directory('home/syt/work') 'home' >>> first_level_directory('/home/syt/work') '/' >>> first_level_directory('work') 'work' >>> :type path: str :param path: the path for which we want the first level directory :rtype: str :return: the first level directory appearing in `path` """ head, tail = split(path) while head and tail: head, tail = split(head) if tail: return tail # path was absolute, head is the fs root return head def abspath_listdir(path): """Lists path's content using absolute paths.""" path = abspath(path) return [join(path, filename) for filename in listdir(path)] def is_binary(filename: str) -> int: """Return true if filename may be a binary file, according to it's extension. :type filename: str :param filename: the name of the file :rtype: bool :return: true if the file is a binary file (actually if it's mime type isn't beginning by text/) """ try: # mypy: Item "None" of "Optional[str]" has no attribute "startswith" # it's handle by the exception return not mimetypes.guess_type(filename)[0].startswith("text") # type: ignore except AttributeError: return 1 def write_open_mode(filename: str) -> str: """Return the write mode that should used to open file. :type filename: str :param filename: the name of the file :rtype: str :return: the mode that should be use to open the file ('w' or 'wb') """ if is_binary(filename): return "wb" return "w" def ensure_fs_mode(filepath, desired_mode=S_IWRITE): """Check that the given file has the given mode(s) set, else try to set it. :type filepath: str :param filepath: path of the file :type desired_mode: int :param desired_mode: ORed flags describing the desired mode. Use constants from the `stat` module for file permission's modes """ mode = stat(filepath)[ST_MODE] if not mode & desired_mode: chmod(filepath, mode | desired_mode) # XXX (syt) unused? kill? class ProtectedFile(FileIO): """A special file-object class that automatically does a 'chmod +w' when needed. XXX: for now, the way it is done allows 'normal file-objects' to be created during the ProtectedFile object lifetime. One way to circumvent this would be to chmod / unchmod on each write operation. 
One other way would be to : - catch the IOError in the __init__ - if IOError, then create a StringIO object - each write operation writes in this StringIO object - on close()/del(), write/append the StringIO content to the file and do the chmod only once """ def __init__(self, filepath: str, mode: str) -> None: self.original_mode = stat(filepath)[ST_MODE] self.mode_changed = False if mode in ("w", "a", "wb", "ab"): if not self.original_mode & S_IWRITE: chmod(filepath, self.original_mode | S_IWRITE) self.mode_changed = True FileIO.__init__(self, filepath, mode) def _restore_mode(self) -> None: """restores the original mode if needed""" if self.mode_changed: chmod(self.name, self.original_mode) # Don't re-chmod in case of several restore self.mode_changed = False def close(self) -> None: """restore mode before closing""" self._restore_mode() FileIO.close(self) def __del__(self) -> None: if not self.closed: self.close() class UnresolvableError(Exception): """Exception raised by relative path when it's unable to compute relative path between two paths. """ def relative_path(from_file, to_file): """Try to get a relative path from `from_file` to `to_file` (path will be absolute if to_file is an absolute file). This function is useful to create link in `from_file` to `to_file`. This typical use case is used in this function description. If both files are relative, they're expected to be relative to the same directory. >>> relative_path( from_file='toto/index.html', to_file='index.html') '../index.html' >>> relative_path( from_file='index.html', to_file='toto/index.html') 'toto/index.html' >>> relative_path( from_file='tutu/index.html', to_file='toto/index.html') '../toto/index.html' >>> relative_path( from_file='toto/index.html', to_file='/index.html') '/index.html' >>> relative_path( from_file='/toto/index.html', to_file='/index.html') '../index.html' >>> relative_path( from_file='/toto/index.html', to_file='/toto/summary.html') 'summary.html' >>> relative_path( from_file='index.html', to_file='index.html') '' >>> relative_path( from_file='/index.html', to_file='toto/index.html') Traceback (most recent call last): File "", line 1, in ? File "", line 37, in relative_path UnresolvableError >>> relative_path( from_file='/index.html', to_file='/index.html') '' >>> :type from_file: str :param from_file: source file (where links will be inserted) :type to_file: str :param to_file: target file (on which links point) :raise UnresolvableError: if it has been unable to guess a correct path :rtype: str :return: the relative path of `to_file` from `from_file` """ from_file = normpath(from_file) to_file = normpath(to_file) if from_file == to_file: return "" if isabs(to_file): if not isabs(from_file): return to_file elif isabs(from_file): raise UnresolvableError() from_parts = from_file.split(sep) to_parts = to_file.split(sep) idem = 1 result = [] while len(from_parts) > 1: dirname = from_parts.pop(0) if idem and len(to_parts) > 1 and dirname == to_parts[0]: to_parts.pop(0) else: idem = 0 result.append("..") result += to_parts return sep.join(result) def lines(path: str, comments: Optional[str] = None) -> List[str]: """Return a list of non empty lines in the file located at `path`. :type path: str :param path: path to the file :type comments: str or None :param comments: optional string which can be used to comment a line in the file (i.e. 
lines starting with this string won't be returned) :rtype: list :return: a list of stripped line in the file, without empty and commented lines :warning: at some point this function will probably return an iterator """ with io.open(path) as stream: return stream_lines(stream, comments) def stream_lines(stream: TextIOWrapper, comments: Optional[str] = None) -> List[str]: """Return a list of non empty lines in the given `stream`. :type stream: object implementing 'xreadlines' or 'readlines' :param stream: file like object :type comments: str or None :param comments: optional string which can be used to comment a line in the file (i.e. lines starting with this string won't be returned) :rtype: list :return: a list of stripped line in the file, without empty and commented lines :warning: at some point this function will probably return an iterator """ try: readlines = stream.xreadlines except AttributeError: readlines = stream.readlines result = [] for line in readlines(): line = line.strip() if line and (comments is None or not line.startswith(comments)): result.append(line) return result def export( from_dir: str, to_dir: str, blacklist: Tuple[str, str, str, str, str, str, str, str] = BASE_BLACKLIST, ignore_ext: Tuple[str, str, str, str, str, str] = IGNORED_EXTENSIONS, verbose: int = 0, ) -> None: """Make a mirror of `from_dir` in `to_dir`, omitting directories and files listed in the black list or ending with one of the given extensions. :type from_dir: str :param from_dir: directory to export :type to_dir: str :param to_dir: destination directory :type blacklist: list or tuple :param blacklist: list of files or directories to ignore, default to the content of `BASE_BLACKLIST` :type ignore_ext: list or tuple :param ignore_ext: list of extensions to ignore, default to the content of `IGNORED_EXTENSIONS` :type verbose: bool :param verbose: flag indicating whether information about exported files should be printed to stderr, default to False """ try: mkdir(to_dir) except OSError: pass # FIXME we should use "exists" if the point is about existing dir # else (permission problems?) shouldn't return / raise ? for directory, dirnames, filenames in walk(from_dir): for norecurs in blacklist: try: dirnames.remove(norecurs) except ValueError: continue for dirname in dirnames: src = join(directory, dirname) dest = to_dir + src[len(from_dir) :] if isdir(src): if not exists(dest): mkdir(dest) for filename in filenames: # don't include binary files # endswith does not accept tuple in 2.4 if any([filename.endswith(ext) for ext in ignore_ext]): continue src = join(directory, filename) dest = to_dir + src[len(from_dir) :] if verbose: print(src, "->", dest, file=sys.stderr) if exists(dest): remove(dest) shutil.copy2(src, dest) def remove_dead_links(directory, verbose=0): """Recursively traverse directory and remove all dead links. :type directory: str :param directory: directory to cleanup :type verbose: bool :param verbose: flag indicating whether information about deleted links should be printed to stderr, default to False """ for dirpath, dirnames, filenames in walk(directory): for filename in dirnames + filenames: src = join(dirpath, filename) if islink(src) and not exists(src): if verbose: print("remove dead link", src) remove(src) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/graph.py0000666000000000000000000002575514762603732017641 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. 
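# Small sketch of the export() helper defined above in fileutils: mirror a
# temporary source tree into a destination directory.  The file names are made
# up, and the '.pyc' file is only skipped if that extension is part of
# IGNORED_EXTENSIONS (whose value is not shown here).
import os
import tempfile
from logilab.common.fileutils import export

src, dest = tempfile.mkdtemp(), tempfile.mkdtemp()
with open(os.path.join(src, "keep.py"), "w") as fobj:
    fobj.write("# kept\n")
with open(os.path.join(src, "skip.pyc"), "w") as fobj:
    fobj.write("")

export(src, dest, verbose=0)
print(sorted(os.listdir(dest)))  # expected to list 'keep.py' but not 'skip.pyc'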
# contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Graph manipulation utilities. (dot generation adapted from pypy/translator/tool/make_dot.py) """ __docformat__ = "restructuredtext en" __metaclass__ = type import os.path as osp import os import sys import tempfile import codecs import errno from typing import Dict, List, Tuple, Optional, Set, TypeVar, Iterable def escape(value): """Make usable in a dot file.""" lines = [line.replace('"', '\\"') for line in value.split("\n")] data = "\\l".join(lines) return "\\n" + data def target_info_from_filename(filename): """Transforms /some/path/foo.png into ('/some/path', 'foo.png', 'png').""" basename = osp.basename(filename) storedir = osp.dirname(osp.abspath(filename)) target = filename.split(".")[-1] return storedir, basename, target class DotBackend: """Dot File backend.""" def __init__( self, graphname, rankdir=None, size=None, ratio=None, charset="utf-8", renderer="dot", additionnal_param={}, ): self.graphname = graphname self.renderer = renderer self.lines = [] self._source = None self.emit("digraph %s {" % normalize_node_id(graphname)) if rankdir: self.emit(f"rankdir={rankdir}") if ratio: self.emit(f"ratio={ratio}") if size: self.emit(f'size="{size}"') if charset: assert charset.lower() in ( "utf-8", "iso-8859-1", "latin1", ), f"unsupported charset {charset}" self.emit(f'charset="{charset}"') for param in sorted(additionnal_param.items()): self.emit("=".join(param)) def get_source(self): """returns self._source""" if self._source is None: self.emit("}\n") self._source = "\n".join(self.lines) del self.lines return self._source source = property(get_source) def generate(self, outputfile=None, dotfile=None, mapfile=None): """Generates a graph file. 
:param outputfile: filename and path [defaults to graphname.png] :param dotfile: filename and path [defaults to graphname.dot] :rtype: str :return: a path to the generated file """ import subprocess # introduced in py 2.4 name = self.graphname if not dotfile: # if 'outputfile' is a dot file use it as 'dotfile' if outputfile and outputfile.endswith(".dot"): dotfile = outputfile else: dotfile = f"{name}.dot" if outputfile is not None: storedir, basename, target = target_info_from_filename(outputfile) if target != "dot": pdot, dot_sourcepath = tempfile.mkstemp(".dot", name) os.close(pdot) else: dot_sourcepath = osp.join(storedir, dotfile) else: target = "png" pdot, dot_sourcepath = tempfile.mkstemp(".dot", name) ppng, outputfile = tempfile.mkstemp(".png", name) os.close(pdot) os.close(ppng) pdot = codecs.open(dot_sourcepath, "w", encoding="utf8") pdot.write(self.source) pdot.close() if target != "dot": if sys.platform == "win32": use_shell = True else: use_shell = False try: if mapfile: subprocess.call( [ self.renderer, "-Tcmapx", "-o", mapfile, "-T", target, dot_sourcepath, "-o", outputfile, ], shell=use_shell, ) else: subprocess.call( [self.renderer, "-T", target, dot_sourcepath, "-o", outputfile], shell=use_shell, ) except OSError as e: if e.errno == errno.ENOENT: e.strerror = f"File not found: {self.renderer}" raise os.unlink(dot_sourcepath) return outputfile def emit(self, line): """Adds to final output.""" self.lines.append(line) def emit_edge(self, name1, name2, **props): """emit an edge from to . edge properties: see http://www.graphviz.org/doc/info/attrs.html """ attrs = [f'{prop}="{value}"' for prop, value in props.items()] n_from, n_to = normalize_node_id(name1), normalize_node_id(name2) self.emit(f"{n_from} -> {n_to} [{', '.join(sorted(attrs))}];") def emit_node(self, name, **props): """emit a node with given properties. node properties: see http://www.graphviz.org/doc/info/attrs.html """ attrs = [f'{prop}="{value}"' for prop, value in props.items()] self.emit(f"{normalize_node_id(name)} [{', '.join(sorted(attrs))}];") def normalize_node_id(nid): """Returns a suitable DOT node id for `nid`.""" return f'"{nid}"' class GraphGenerator: def __init__(self, backend): # the backend is responsible to output the graph in a particular format self.backend = backend # XXX doesn't like space in outpufile / mapfile def generate(self, visitor, propshdlr, outputfile=None, mapfile=None): # the visitor # the property handler is used to get node and edge properties # according to the graph and to the backend self.propshdlr = propshdlr for nodeid, node in visitor.nodes(): props = propshdlr.node_properties(node) self.backend.emit_node(nodeid, **props) for subjnode, objnode, edge in visitor.edges(): props = propshdlr.edge_properties(edge, subjnode, objnode) self.backend.emit_edge(subjnode, objnode, **props) return self.backend.generate(outputfile=outputfile, mapfile=mapfile) class UnorderableGraph(Exception): pass V = TypeVar("V") _Graph = Dict[V, List[V]] def ordered_nodes(graph: _Graph) -> Tuple[V, ...]: """takes a dependency graph dict as arguments and return an ordered tuple of nodes starting with nodes without dependencies and up to the outermost node. If there is some cycle in the graph, :exc:`UnorderableGraph` will be raised. Also the given graph dict will be emptied. 
""" # check graph consistency cycles: List[List[V]] = get_cycles(graph) if cycles: bad_cycles = "\n".join([" -> ".join(map(str, cycle)) for cycle in cycles]) raise UnorderableGraph(f"cycles in graph: {bad_cycles}") vertices = set(graph) to_vertices = set() for edges in graph.values(): to_vertices |= set(edges) missing_vertices = to_vertices - vertices if missing_vertices: raise UnorderableGraph(f"missing vertices: {', '.join(missing_vertices)}") # order vertices order = [] order_set = set() old_len = None while graph: if old_len == len(graph): raise UnorderableGraph(f"unknown problem with {graph}") old_len = len(graph) deps_ok = [] for node, node_deps in graph.items(): for dep in node_deps: if dep not in order_set: break else: deps_ok.append(node) order.append(deps_ok) order_set |= set(deps_ok) for node in deps_ok: del graph[node] result = [] for grp in reversed(order): result.extend(sorted(grp)) return tuple(result) def get_cycles(graph_dict: _Graph, vertices: Optional[Iterable] = None) -> List[List]: """given a dictionary representing an ordered graph (i.e. key are vertices and values is a list of destination vertices representing edges), return a list of detected cycles """ if not graph_dict: return [] result: List[List] = [] if vertices is None: vertices = graph_dict.keys() for vertice in vertices: _get_cycles(graph_dict, [], set(), result, vertice) return result def _get_cycles( graph_dict: _Graph, path: List, visited: Set, result: List[List], vertice: V ) -> None: """recursive function doing the real work for get_cycles""" if vertice in path: cycle = [vertice] for node in path[::-1]: if node == vertice: break cycle.insert(0, node) # make a canonical representation # mypy: error: Value of type variable "_LT" of "min" cannot be "V" # XXX I have no idea what this exactly mean here... we probably need to # XXX tell that "V" supports "lower than" type of comparison but how? start_from = min(cycle) # type: ignore index = cycle.index(start_from) cycle = cycle[index:] + cycle[0:index] # append it to result if not already in if cycle not in result: result.append(cycle) return path.append(vertice) try: for node in graph_dict[vertice]: # don't check already visited nodes again if node not in visited: _get_cycles(graph_dict, path, visited, result, node) visited.add(node) except KeyError: pass path.pop() def has_path( graph_dict: Dict[str, List[str]], fromnode: str, tonode: str, path: Optional[List[str]] = None ) -> Optional[List[str]]: """generic function taking a simple graph definition as a dictionary, with node has key associated to a list of nodes directly reachable from it. Return None if no path exists to go from `fromnode` to `tonode`, else the first path found (as a list including the destination node at last) """ if path is None: path = [] elif fromnode in path: return None path.append(fromnode) for destnode in graph_dict[fromnode]: if destnode == tonode or has_path(graph_dict, destnode, tonode, path): return path[1:] + [tonode] path.pop() return None ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/interface.py0000666000000000000000000000511414762603732020463 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. 
# # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Bases class for interfaces to provide 'light' interface handling. TODO: _ implements a check method which check that an object implements the interface _ Attribute objects This module requires at least python 2.2 """ __docformat__ = "restructuredtext en" class Interface: """Base class for interfaces.""" @classmethod def is_implemented_by(cls, instance: type) -> bool: return implements(instance, cls) def implements(obj: type, interface: type) -> bool: """Return true if the give object (maybe an instance or class) implements the interface. """ kimplements = getattr(obj, "__implements__", ()) if not isinstance(kimplements, (list, tuple)): kimplements = (kimplements,) for implementedinterface in kimplements: if issubclass(implementedinterface, interface): return True return False def extend(klass: type, interface: type, _recurs: bool = False) -> None: """Add interface to klass'__implements__ if not already implemented in. If klass is subclassed, ensure subclasses __implements__ it as well. NOTE: klass should be e new class. """ if not implements(klass, interface): try: kimplements = klass.__implements__ # type: ignore kimplementsklass = type(kimplements) kimplements = list(kimplements) except AttributeError: kimplementsklass = tuple kimplements = [] kimplements.append(interface) klass.__implements__ = kimplementsklass(kimplements) # type: ignore for subklass in klass.__subclasses__(): extend(subklass, interface, _recurs=True) elif _recurs: for subklass in klass.__subclasses__(): extend(subklass, interface, _recurs=True) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/logging_ext.py0000666000000000000000000001505414762603732021035 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . 
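# Brief sketch of the light interface helpers above (logilab.common.interface);
# IWalkable/ISwimmable/Duck are invented names for the example.
from logilab.common.interface import Interface, extend, implements

class IWalkable(Interface):
    """objects able to walk"""

class Duck:
    __implements__ = (IWalkable,)

print(implements(Duck, IWalkable))        # True
print(IWalkable.is_implemented_by(Duck))  # True

class ISwimmable(Interface):
    """objects able to swim"""

extend(Duck, ISwimmable)             # appends ISwimmable to Duck.__implements__
print(implements(Duck, ISwimmable))  # True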
"""Extends the logging module from the standard library.""" __docformat__ = "restructuredtext en" import os import sys import logging from logilab.common.textutils import colorize_ansi def set_log_methods(cls, logger): """bind standard logger's methods as methods on the class""" cls.__logger = logger for attr in ("debug", "info", "warning", "error", "critical", "exception"): setattr(cls, attr, getattr(logger, attr)) def xxx_cyan(record): if "XXX" in record.message: return "cyan" class ColorFormatter(logging.Formatter): """ A color Formatter for the logging standard module. By default, colorize CRITICAL and ERROR in red, WARNING in orange, INFO in green and DEBUG in yellow. self.colors is customizable via the 'color' constructor argument (dictionary). self.colorfilters is a list of functions that get the LogRecord and return a color name or None. """ def __init__(self, fmt=None, datefmt=None, colors=None): logging.Formatter.__init__(self, fmt, datefmt) self.colorfilters = [] self.colors = { "CRITICAL": "background_red", "ERROR": "red", "WARNING": "yellow", "INFO": "white", "DEBUG": "cyan", } if colors is not None: assert isinstance(colors, dict) self.colors.update(colors) def format(self, record): msg = logging.Formatter.format(self, record) if record.levelname in self.colors: color = self.colors[record.levelname] return colorize_ansi(msg, color) else: for cf in self.colorfilters: color = cf(record) if color: return colorize_ansi(msg, color) return msg def set_color_formatter(logger=None, **kw): """ Install a color formatter on the 'logger'. If not given, it will defaults to the default logger. Any additional keyword will be passed as-is to the ColorFormatter constructor. """ if logger is None: logger = logging.getLogger() if not logger.handlers: logging.basicConfig() format_msg = logger.handlers[0].formatter._fmt fmt = ColorFormatter(format_msg, **kw) fmt.colorfilters.append(xxx_cyan) logger.handlers[0].setFormatter(fmt) LOG_FORMAT = "%(asctime)s - (%(name)s) %(levelname)s: %(message)s" LOG_DATE_FORMAT = "%Y-%m-%d %H:%M:%S" def get_handler(debug=False, syslog=False, logfile=None, rotation_parameters=None): """get an apropriate handler according to given parameters""" if os.environ.get("APYCOT_ROOT"): handler = logging.StreamHandler(sys.stdout) if debug: handler = logging.StreamHandler() elif logfile is None: if syslog: from logging import handlers handler = handlers.SysLogHandler() else: handler = logging.StreamHandler() else: try: if rotation_parameters is None: if os.name == "posix" and sys.version_info >= (2, 6): from logging.handlers import WatchedFileHandler handler = WatchedFileHandler(logfile) else: handler = logging.FileHandler(logfile) else: from logging.handlers import TimedRotatingFileHandler handler = TimedRotatingFileHandler(logfile, **rotation_parameters) except OSError: handler = logging.StreamHandler() return handler def get_threshold(debug=False, logthreshold=None): if logthreshold is None: if debug: logthreshold = logging.DEBUG else: logthreshold = logging.ERROR elif isinstance(logthreshold, str): logthreshold = getattr(logging, THRESHOLD_MAP.get(logthreshold, logthreshold)) return logthreshold def _colorable_terminal(): isatty = hasattr(sys.__stdout__, "isatty") and sys.__stdout__.isatty() if not isatty: return False if os.name == "nt": try: from colorama import init as init_win32_colors except ImportError: return False init_win32_colors() return True def get_formatter(logformat=LOG_FORMAT, logdateformat=LOG_DATE_FORMAT): if _colorable_terminal(): fmt = 
ColorFormatter(logformat, logdateformat) def col_fact(record): if "XXX" in record.message: return "cyan" if "kick" in record.message: return "red" fmt.colorfilters.append(col_fact) else: fmt = logging.Formatter(logformat, logdateformat) return fmt def init_log( debug=False, syslog=False, logthreshold=None, logfile=None, logformat=LOG_FORMAT, logdateformat=LOG_DATE_FORMAT, fmt=None, rotation_parameters=None, handler=None, ): """init the log service""" logger = logging.getLogger() if handler is None: handler = get_handler(debug, syslog, logfile, rotation_parameters) # only addHandler and removeHandler method while I would like a setHandler # method, so do it this way :$ logger.handlers = [handler] logthreshold = get_threshold(debug, logthreshold) logger.setLevel(logthreshold) if fmt is None: if debug: fmt = get_formatter(logformat=logformat, logdateformat=logdateformat) else: fmt = logging.Formatter(logformat, logdateformat) handler.setFormatter(fmt) return handler # map logilab.common.logger thresholds to logging thresholds THRESHOLD_MAP = { "LOG_DEBUG": "DEBUG", "LOG_INFO": "INFO", "LOG_NOTICE": "INFO", "LOG_WARN": "WARNING", "LOG_WARNING": "WARNING", "LOG_ERR": "ERROR", "LOG_ERROR": "ERROR", "LOG_CRIT": "CRITICAL", } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/modutils.py0000666000000000000000000002233314762603732020365 0ustar00rootroot# copyright 2003-2013 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Python modules manipulation utility functions.""" __docformat__ = "restructuredtext en" import sys import os from os.path import ( join, abspath, exists, expanduser, normcase, realpath, ) from typing import Dict, List, Optional, Sequence from importlib import import_module from logilab.common import STD_BLACKLIST, _handle_blacklist from logilab.common.deprecation import callable_deprecated class LazyObject: """ This class allows to lazyly declare a object (most likely only a callable according to the code) from a module without importing it. The import will be triggered when the user tries to access attributes of the object/callable or call it. Trying to set or delete attributes of the wrapped object/callable will not works as expected. 
""" def __init__(self, module, obj): self.module = module self.obj = obj self._imported = None def _getobj(self): if self._imported is None: self._imported = getattr(import_module(self.module), self.obj) return self._imported def __getattribute__(self, attr): try: return super(LazyObject, self).__getattribute__(attr) except AttributeError: return getattr(self._getobj(), attr) def __call__(self, *args, **kwargs): return self._getobj()(*args, **kwargs) def _check_init(path: str, mod_path: List[str]) -> bool: """check there are some __init__.py all along the way""" def _has_init(directory: str) -> Optional[str]: """if the given directory has a valid __init__ file, return its path, else return None """ mod_or_pack = join(directory, "__init__") for ext in ("py", "pyc", "pyo"): if exists(mod_or_pack + "." + ext): return mod_or_pack + "." + ext return None def _has_dirs(directory: str) -> bool: for file in os.listdir(directory): if os.path.isdir(os.path.join(directory, path)): return True return False for part in mod_path: path = join(path, part) if not _has_init(path) and not _has_dirs(path): return False return True @callable_deprecated( "you should avoid using modpath_from_file(), it doesn't play well with symlinks and " "sys.meta_path and you should use python standard loaders" ) def modpath_from_file(filename: str, extrapath: Optional[Dict[str, str]] = None) -> List[str]: """DEPRECATED: doesn't play well with symlinks and sys.meta_path Given a file path return the corresponding splitted module's name (i.e name of a module or package splitted on '.') :type filename: str :param filename: file's path for which we want the module's name :type extrapath: dict :param extrapath: optional extra search path, with path as key and package name for the path as value. This is usually useful to handle package splitted in multiple directories using __path__ trick. :raise ImportError: if the corresponding module's name has not been found :rtype: list(str) :return: the corresponding splitted module's name """ def _canonicalize_path(path: str) -> str: return realpath(expanduser((path))) def _is_in_a_valid_module(directory_or_file: str) -> bool: """ Try to emulate a reverse version of the new rule of PEP 420 to determine if a file is in a valid module. https://peps.python.org/pep-0420/ To quote it: > During import processing, the import machinery will continue to > iterate over each directory in the parent path as it does in Python > 3.2. While looking for a module or package named “foo”, for each > directory in the parent path: > > * If /foo/__init__.py is found, a regular package is imported and returned. > * If not, but /foo.{py,pyc,so,pyd} is found, a module is imported and > returned. The exact list of extension varies by platform and whether the -O flag is > specified. The list here is representative. > * If not, but /foo is found and is a directory, it is recorded and the scan > continues with the next directory in the parent path. > * Otherwise the scan continues with the next directory in the parent path. > > If the scan completes without returning a module or package, and at least one > directory was recorded, then a namespace package is created. The new namespace > package: > > * Has a __path__ attribute set to an iterable of the path strings that were > found and recorded during the scan. > * Does not have a __file__ attribute. """ # XXX to quote documentation: The exact list of extension varies by # platform and whether the -O flag is specified. 
# So this code is not great at all python_extensions = ("py", "pyc", "pyo", "so", "pyd") directory = directory_or_file if not os.path.isdir(directory_or_file): # /foo.{py,pyc,so,pyd} situation if os.path.exists(directory_or_file) and directory_or_file.endswith( tuple("." + extension for extension in python_extensions) ): return True directory = os.path.split(directory_or_file)[0] mod_or_pack = join(directory, "__init__") for ext in python_extensions: # /foo/__init__.py case if exists(mod_or_pack + "." + ext): return True return False filename = _canonicalize_path(filename) base = os.path.splitext(filename)[0] if extrapath is not None: for path_ in map(_canonicalize_path, extrapath): path = abspath(path_) if path and normcase(base[: len(path)]) == normcase(path): if _is_in_a_valid_module(filename): submodpath = [pkg for pkg in base[len(path) :].split(os.sep) if pkg] return extrapath[path_].split(".") + submodpath for path in map(_canonicalize_path, sys.path): if path and normcase(base).startswith(path): modpath = [pkg for pkg in base[len(path) :].split(os.sep) if pkg] if _is_in_a_valid_module(filename): return modpath raise ImportError( "Unable to find module for %s in:\n* %s" % (filename, "\n* ".join(sys.path + list(extrapath.keys() if extrapath else []))) ) def get_module_files(src_directory: str, blacklist: Sequence[str] = STD_BLACKLIST) -> List[str]: """given a package directory return a list of all available python module's files in the package and its subpackages :type src_directory: str :param src_directory: path of the directory corresponding to the package :type blacklist: list or tuple :param blacklist: optional list of files or directory to ignore, default to the value of `logilab.common.STD_BLACKLIST` :rtype: list :return: the list of all available python module's files in the package and its subpackages """ files = [] for directory, dirnames, filenames in os.walk(src_directory): _handle_blacklist(blacklist, dirnames, filenames) # check for __init__.py if "__init__.py" not in filenames: dirnames[:] = () continue for filename in filenames: if filename.endswith((".py", ".so", ".pyd", ".pyw")): src = join(directory, filename) files.append(src) return files def cleanup_sys_modules(directories): """remove submodules of `directories` from `sys.modules`""" cleaned = [] for modname, module in list(sys.modules.items()): modfile = getattr(module, "__file__", None) if modfile: for directory in directories: if modfile.startswith(directory): cleaned.append(modname) del sys.modules[modname] break return cleaned def clean_sys_modules(names): """remove submodules starting with name from `names` from `sys.modules`""" cleaned = set() for modname in list(sys.modules): for name in names: if modname.startswith(name): del sys.modules[modname] cleaned.add(modname) break return cleaned ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/optik_ext.py0000666000000000000000000003740514762603732020541 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. 
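# Quick sketch of the module discovery helpers above (logilab.common.modutils);
# logilab.common itself is used as the scanned package since it ships with this
# distribution.
import os
import logilab.common
from logilab.common.modutils import clean_sys_modules, get_module_files

pkg_dir = os.path.dirname(logilab.common.__file__)
files = get_module_files(pkg_dir)
print(len(files), "python files found under", pkg_dir)

# drop any already-imported submodule of logilab.common from sys.modules
print(sorted(clean_sys_modules(["logilab.common."])))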
# # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Add an abstraction level to transparently import optik classes from optparse (python >= 2.3) or the optik package. It also defines three new types for optik/optparse command line parser : * regexp argument of this type will be converted using re.compile * csv argument of this type will be converted using split(',') * yn argument of this type will be true if 'y' or 'yes', false if 'n' or 'no' * named argument of this type are in the form = or : * password argument of this type wont be converted but this is used by other tools such as interactive prompt for configuration to double check value and use an invisible field * multiple_choice same as default "choice" type but multiple choices allowed * file argument of this type wont be converted but checked that the given file exists * color argument of this type wont be converted but checked its either a named color or a color specified using hexadecimal notation (preceded by a #) * time argument of this type will be converted to a float value in seconds according to time units (ms, s, min, h, d) * bytes argument of this type will be converted to a float value in bytes according to byte units (b, kb, mb, gb, tb) """ __docformat__ = "restructuredtext en" import re import sys import time from copy import copy from os.path import exists from logilab.common import attrdict from typing import Any, Union, List, Optional, Tuple, Dict from _io import StringIO # python >= 2.3 from optparse import ( # noqa OptionParser as BaseParser, Option as BaseOption, OptionGroup, OptionContainer, OptionValueError, OptionError, Values, HelpFormatter, SUPPRESS_HELP, NO_DEFAULT, ) try: from mx import DateTime HAS_MX_DATETIME = True except ImportError: HAS_MX_DATETIME = False from logilab.common.textutils import splitstrip, TIME_UNITS, BYTE_UNITS, apply_units def check_regexp(option, opt, value): """check a regexp value by trying to compile it return the compiled regexp """ if hasattr(value, "pattern"): return value try: return re.compile(value) except ValueError: raise OptionValueError(f"option {opt}: invalid regexp value: {value!r}") def check_csv( option: Optional["Option"], opt: str, value: Union[List[str], Tuple[str, ...], str] ) -> Union[List[str], Tuple[str, ...]]: """check a csv value by trying to split it return the list of separated values """ if isinstance(value, (list, tuple)): return value try: return splitstrip(value) except ValueError: raise OptionValueError(f"option {opt}: invalid csv value: {value!r}") def check_yn(option: Optional["Option"], opt: str, value: Union[bool, str]) -> bool: """check a yn value return true for yes and false for no """ if isinstance(value, int): return bool(value) if value in ("y", "yes"): return True if value in ("n", "no"): return False msg = "option %s: invalid yn value %r, should be in (y, yes, n, no)" raise OptionValueError(msg % (opt, value)) def check_named( option: Optional[Any], opt: str, value: Union[Dict[str, str], str] ) -> Dict[str, str]: """check a named value return a dictionary containing (name, value) associations """ if isinstance(value, dict): return value values: List[Tuple[str, str]] = [] for value in check_csv(option, opt, value): # mypy: 
Argument 1 to "append" of "list" has incompatible type "List[str]"; # mypy: expected "Tuple[str, str]" # we know that the split will give a 2 items list if value.find("=") != -1: values.append(value.split("=", 1)) # type: ignore elif value.find(":") != -1: values.append(value.split(":", 1)) # type: ignore if values: return dict(values) msg = "option %s: invalid named value %r, should be = or \ :" raise OptionValueError(msg % (opt, value)) def check_password(option, opt, value): """check a password value (can't be empty)""" # no actual checking, monkey patch if you want more return value def check_file(option, opt, value): """check a file value return the filepath """ if exists(value): return value msg = "option %s: file %r does not exist" raise OptionValueError(msg % (opt, value)) # XXX use python datetime def check_date(option, opt, value): """check a file value return the filepath """ try: return DateTime.strptime(value, "%Y/%m/%d") except DateTime.Error: raise OptionValueError(f"expected format of {opt} is yyyy/mm/dd") def check_color(option, opt, value): """check a color value and returns it /!\\ does *not* check color labels (like 'red', 'green'), only checks hexadecimal forms """ # Case (1) : color label, we trust the end-user if re.match("[a-z0-9 ]+$", value, re.I): return value # Case (2) : only accepts hexadecimal forms if re.match("#[a-f0-9]{6}", value, re.I): return value # Else : not a color label neither a valid hexadecimal form => error msg = "option %s: invalid color : %r, should be either hexadecimal \ value or predefined color" raise OptionValueError(msg % (opt, value)) def check_time(option, opt, value): if isinstance(value, (int, float)): return value return apply_units(value, TIME_UNITS) def check_bytes(option: Optional["Option"], opt: str, value: Any) -> int: if hasattr(value, "__int__"): return value # mypy: Incompatible return value type (got "Union[float, int]", expected "int") # we force "int" using "final=int" return apply_units(value, BYTE_UNITS, final=int) # type: ignore class Option(BaseOption): """override optik.Option to add some new option types""" TYPES = BaseOption.TYPES + ( "regexp", "csv", "yn", "named", "password", "multiple_choice", "file", "color", "time", "bytes", ) ATTRS = BaseOption.ATTRS + ["hide", "level"] TYPE_CHECKER = copy(BaseOption.TYPE_CHECKER) TYPE_CHECKER["regexp"] = check_regexp TYPE_CHECKER["csv"] = check_csv TYPE_CHECKER["yn"] = check_yn TYPE_CHECKER["named"] = check_named TYPE_CHECKER["multiple_choice"] = check_csv TYPE_CHECKER["file"] = check_file TYPE_CHECKER["color"] = check_color TYPE_CHECKER["password"] = check_password TYPE_CHECKER["time"] = check_time TYPE_CHECKER["bytes"] = check_bytes if HAS_MX_DATETIME: TYPES += ("date",) TYPE_CHECKER["date"] = check_date def __init__(self, *opts: str, **attrs: Any) -> None: BaseOption.__init__(self, *opts, **attrs) # mypy: "Option" has no attribute "hide" # we test that in the if if hasattr(self, "hide") and self.hide: # type: ignore self.help = SUPPRESS_HELP def _check_choice(self) -> None: """FIXME: need to override this due to optik misdesign""" if self.type in ("choice", "multiple_choice"): # mypy: "Option" has no attribute "choices" # we know that option of this type has this attribute if self.choices is None: # type: ignore raise OptionError("must supply a list of choices for type 'choice'", self) elif not isinstance(self.choices, (tuple, list)): # type: ignore raise OptionError( "choices must be a list of strings ('%s' supplied)" % str(type(self.choices)).split("'")[1], # type: 
ignore self, ) elif self.choices is not None: # type: ignore raise OptionError(f"must not supply choices for type {self.type!r}", self) # mypy: Unsupported target for indexed assignment # black magic? BaseOption.CHECK_METHODS[2] = _check_choice # type: ignore def process(self, opt: str, value: str, values: Values, parser: BaseParser) -> int: # First, convert the value(s) to the right type. Howl if any # value(s) are bogus. value = self.convert_value(opt, value) if self.type == "named": assert self.dest is not None existant = getattr(values, self.dest) if existant: existant.update(value) value = existant # And then take whatever action is expected of us. # This is a separate method to make life easier for # subclasses to add new actions. # mypy: Argument 2 to "take_action" of "Option" has incompatible type "Optional[str]"; # mypy: expected "str" # is it ok? return self.take_action(self.action, self.dest, opt, value, values, parser) # type: ignore class OptionParser(BaseParser): """override optik.OptionParser to use our Option class""" def __init__(self, option_class: type = Option, *args: Any, **kwargs: Any) -> None: # mypy: Argument "option_class" to "__init__" of "OptionParser" has incompatible type # mypy: "type"; expected "Option" # mypy is doing really weird things with *args/**kwargs and looks buggy BaseParser.__init__(self, option_class=option_class, *args, **kwargs) # type: ignore def format_option_help(self, formatter: Optional[HelpFormatter] = None) -> str: if formatter is None: formatter = self.formatter outputlevel = getattr(formatter, "output_level", 0) formatter.store_option_strings(self) result = [] result.append(formatter.format_heading("Options")) formatter.indent() if self.option_list: result.append(OptionContainer.format_option_help(self, formatter)) result.append("\n") for group in self.option_groups: # mypy: "OptionParser" has no attribute "level" # but it has one no? 
if group.level <= outputlevel and ( # type: ignore group.description or level_options(group, outputlevel) ): result.append(group.format_help(formatter)) result.append("\n") formatter.dedent() # Drop the last "\n", or the header if no options or option groups: return "".join(result[:-1]) # mypy error: error: "Type[OptionGroup]" has no attribute "level" # monkeypatching OptionGroup.level = 0 # type: ignore def level_options(group: BaseParser, outputlevel: int) -> List[BaseOption]: # mypy: "Option" has no attribute "help" # but it does return [ option for option in group.option_list if (getattr(option, "level", 0) or 0) <= outputlevel and option.help is not SUPPRESS_HELP # type: ignore # noqa ] def format_option_help(self, formatter): result = [] outputlevel = getattr(formatter, "output_level", 0) or 0 for option in level_options(self, outputlevel): result.append(formatter.format_option(option)) return "".join(result) # mypy error: Cannot assign to a method # but we still do it because magic OptionContainer.format_option_help = format_option_help # type: ignore class ManHelpFormatter(HelpFormatter): """Format help using man pages ROFF format""" def __init__( self, indent_increment: int = 0, max_help_position: int = 24, width: int = 79, short_first: int = 0, ) -> None: HelpFormatter.__init__(self, indent_increment, max_help_position, width, short_first) def format_heading(self, heading: str) -> str: return f".SH {heading.upper()}\n" def format_description(self, description): return description def format_option(self, option: BaseParser) -> str: try: # mypy: "Option" has no attribute "option_strings" # we handle if it doesn't optstring = option.option_strings # type: ignore except AttributeError: optstring = self.format_option_strings(option) # mypy: "OptionParser" has no attribute "help" # it does if option.help: # type: ignore # mypy: Argument 1 to "expand_default" of "HelpFormatter" has incompatible type # mypy: "OptionParser"; expected "Option" # it still works? 
help_text = self.expand_default(option) # type: ignore help = " ".join([line.strip() for line in help_text.splitlines()]) else: help = "" return f""".IP "{optstring}" {help} """ def format_head(self, optparser: OptionParser, pkginfo: attrdict, section: int = 1) -> str: long_desc = "" pgm = optparser.get_prog_name() short_desc = self.format_short_description(pgm, pkginfo.description) if hasattr(pkginfo, "long_desc"): long_desc = self.format_long_description(pgm, pkginfo.long_desc) return "%s\n%s\n%s\n%s" % ( self.format_title(pgm, section), short_desc, self.format_synopsis(pgm), long_desc, ) def format_title(self, pgm: str, section: int) -> str: date = "-".join([str(num) for num in time.localtime()[:3]]) return f'.TH {pgm} {section} "{date}" {pgm}' def format_short_description(self, pgm: str, short_desc: str) -> str: return r""".SH NAME .B %s \- %s """ % ( pgm, short_desc.strip(), ) def format_synopsis(self, pgm: str) -> str: return f""".SH SYNOPSIS .B {pgm} [ .I OPTIONS ] [ .I ] """ def format_long_description(self, pgm, long_desc): long_desc = "\n".join([line.lstrip() for line in long_desc.splitlines()]) long_desc = long_desc.replace("\n.\n", "\n\n") if long_desc.lower().startswith(pgm): long_desc = long_desc[len(pgm) :] return f""".SH DESCRIPTION .B {pgm} {long_desc.strip()} """ def format_tail(self, pkginfo: attrdict) -> str: tail = """.SH SEE ALSO /usr/share/doc/pythonX.Y-%s/ .SH BUGS Please report bugs on the project\'s mailing list: %s .SH AUTHOR %s <%s> """ % ( getattr(pkginfo, "debian_name", pkginfo.modname), pkginfo.mailinglist, pkginfo.author, pkginfo.author_email, ) if hasattr(pkginfo, "copyright"): tail += f""" .SH COPYRIGHT {pkginfo.copyright} """ return tail def generate_manpage( optparser: OptionParser, pkginfo: attrdict, section: int = 1, stream: StringIO = sys.stdout, level: int = 0, ) -> None: """generate a man page from an optik parser""" formatter = ManHelpFormatter() # mypy: "ManHelpFormatter" has no attribute "output_level" # dynamic attribute? formatter.output_level = level # type: ignore formatter.parser = optparser print(formatter.format_head(optparser, pkginfo, section), file=stream) print(optparser.format_option_help(formatter), file=stream) print(formatter.format_tail(pkginfo), file=stream) __all__ = ("OptionParser", "Option", "OptionGroup", "OptionValueError", "Values") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/proc.py0000666000000000000000000002162314762603732017471 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . 
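# Minimal sketch of the extended option types defined above ('yn', 'csv',
# 'named') through logilab.common.optik_ext; the option names are made up.
from logilab.common.optik_ext import OptionParser

parser = OptionParser()
parser.add_option("--force", type="yn", default=False, help="y/n flag")
parser.add_option("--tags", type="csv", default=(), help="comma separated values")
parser.add_option("--env", type="named", help="name=value pairs")

options, args = parser.parse_args(
    ["--force", "yes", "--tags", "a,b,c", "--env", "key=value,mode=fast"]
)
print(options.force)  # True
print(options.tags)   # should print ['a', 'b', 'c']
print(options.env)    # should print {'key': 'value', 'mode': 'fast'}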
"""module providing: * process information (linux specific: rely on /proc) * a class for resource control (memory / time / cpu time) This module doesn't work on windows platforms (only tested on linux) :organization: Logilab """ __docformat__ = "restructuredtext en" import os import stat from resource import getrlimit, setrlimit, RLIMIT_CPU, RLIMIT_AS from signal import signal, SIGXCPU, SIGKILL, SIGUSR2, SIGUSR1 from threading import Timer, currentThread, Thread, Event from time import time from logilab.common.tree import Node class NoSuchProcess(Exception): pass def proc_exists(pid): """check the a pid is registered in /proc raise NoSuchProcess exception if not """ if not os.path.exists(f"/proc/{pid}"): raise NoSuchProcess() PPID = 3 UTIME = 13 STIME = 14 CUTIME = 15 CSTIME = 16 VSIZE = 22 class ProcInfo(Node): """provide access to process information found in /proc""" def __init__(self, pid): self.pid = int(pid) Node.__init__(self, self.pid) proc_exists(self.pid) self.file = f"/proc/{self.pid}/stat" self.ppid = int(self.status()[PPID]) def memory_usage(self): """return the memory usage of the process in Ko""" try: return int(self.status()[VSIZE]) except OSError: return 0 def lineage_memory_usage(self): return self.memory_usage() + sum([child.lineage_memory_usage() for child in self.children]) def time(self, children=0): """return the number of jiffies that this process has been scheduled in user and kernel mode""" status = self.status() time = int(status[UTIME]) + int(status[STIME]) if children: time += int(status[CUTIME]) + int(status[CSTIME]) return time def status(self): """return the list of fields found in /proc//stat""" return open(self.file).read().split() def name(self): """return the process name found in /proc//stat""" return self.status()[1].strip("()") def age(self): """return the age of the process""" return os.stat(self.file)[stat.ST_MTIME] class ProcInfoLoader: """manage process information""" def __init__(self): self._loaded = {} def list_pids(self): """return a list of existent process ids""" for subdir in os.listdir("/proc"): if subdir.isdigit(): yield int(subdir) def load(self, pid): """get a ProcInfo object for a given pid""" pid = int(pid) try: return self._loaded[pid] except KeyError: procinfo = ProcInfo(pid) procinfo.manager = self self._loaded[pid] = procinfo return procinfo def load_all(self): """load all processes information""" for pid in self.list_pids(): try: procinfo = self.load(pid) if procinfo.parent is None and procinfo.ppid: pprocinfo = self.load(procinfo.ppid) pprocinfo.append(procinfo) except NoSuchProcess: pass class ResourceError(Exception): """Error raise when resource limit is reached""" limit = "Unknown Resource Limit" class XCPUError(ResourceError): """Error raised when CPU Time limit is reached""" limit = "CPU Time" class LineageMemoryError(ResourceError): """Error raised when the total amount of memory used by a process and it's child is reached""" limit = "Lineage total Memory" class TimeoutError(ResourceError): """Error raised when the process is running for to much time""" limit = "Real Time" # Can't use subclass because the StandardError MemoryError raised RESOURCE_LIMIT_EXCEPTION = (ResourceError, MemoryError) class MemorySentinel(Thread): """A class checking a process don't use too much memory in a separated daemonic thread """ def __init__(self, interval, memory_limit, gpid=None): Thread.__init__(self, target=self._run, name="Test.Sentinel") self.memory_limit = memory_limit self._stop = Event() self.interval = interval 
self.setDaemon(True) self.gpid = gpid if gpid is not None else os.getpid() def stop(self): """stop ap""" self._stop.set() def _run(self): pil = ProcInfoLoader() while not self._stop.isSet(): if self.memory_limit <= pil.load(self.gpid).lineage_memory_usage(): os.killpg(self.gpid, SIGUSR1) self._stop.wait(self.interval) class ResourceController: def __init__(self, max_cpu_time=None, max_time=None, max_memory=None, max_reprieve=60): if SIGXCPU == -1: raise RuntimeError("Unsupported platform") self.max_time = max_time self.max_memory = max_memory self.max_cpu_time = max_cpu_time self._reprieve = max_reprieve self._timer = None self._msentinel = None self._old_max_memory = None self._old_usr1_hdlr = None self._old_max_cpu_time = None self._old_usr2_hdlr = None self._old_sigxcpu_hdlr = None self._limit_set = 0 self._abort_try = 0 self._start_time = None self._elapse_time = 0 def _hangle_sig_timeout(self, sig, frame): raise TimeoutError() def _hangle_sig_memory(self, sig, frame): if self._abort_try < self._reprieve: self._abort_try += 1 raise LineageMemoryError("Memory limit reached") else: os.killpg(os.getpid(), SIGKILL) def _handle_sigxcpu(self, sig, frame): if self._abort_try < self._reprieve: self._abort_try += 1 raise XCPUError("Soft CPU time limit reached") else: os.killpg(os.getpid(), SIGKILL) def _time_out(self): if self._abort_try < self._reprieve: self._abort_try += 1 os.killpg(os.getpid(), SIGUSR2) if self._limit_set > 0: self._timer = Timer(1, self._time_out) self._timer.start() else: os.killpg(os.getpid(), SIGKILL) def setup_limit(self): """set up the process limit""" assert currentThread().getName() == "MainThread" os.setpgrp() if self._limit_set <= 0: if self.max_time is not None: self._old_usr2_hdlr = signal(SIGUSR2, self._hangle_sig_timeout) self._timer = Timer(max(1, int(self.max_time) - self._elapse_time), self._time_out) self._start_time = int(time()) self._timer.start() if self.max_cpu_time is not None: self._old_max_cpu_time = getrlimit(RLIMIT_CPU) cpu_limit = (int(self.max_cpu_time), self._old_max_cpu_time[1]) self._old_sigxcpu_hdlr = signal(SIGXCPU, self._handle_sigxcpu) setrlimit(RLIMIT_CPU, cpu_limit) if self.max_memory is not None: self._msentinel = MemorySentinel(1, int(self.max_memory)) self._old_max_memory = getrlimit(RLIMIT_AS) self._old_usr1_hdlr = signal(SIGUSR1, self._hangle_sig_memory) as_limit = (int(self.max_memory), self._old_max_memory[1]) setrlimit(RLIMIT_AS, as_limit) self._msentinel.start() self._limit_set += 1 def clean_limit(self): """reinstall the old process limit""" if self._limit_set > 0: if self.max_time is not None: self._timer.cancel() self._elapse_time += int(time()) - self._start_time self._timer = None signal(SIGUSR2, self._old_usr2_hdlr) if self.max_cpu_time is not None: setrlimit(RLIMIT_CPU, self._old_max_cpu_time) signal(SIGXCPU, self._old_sigxcpu_hdlr) if self.max_memory is not None: self._msentinel.stop() self._msentinel = None setrlimit(RLIMIT_AS, self._old_max_memory) signal(SIGUSR1, self._old_usr1_hdlr) self._limit_set -= 1 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/py.typed0000666000000000000000000000000014762603732017635 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/registry.py0000666000000000000000000013146214762603732020401 0ustar00rootroot# copyright 2003-2013 LOGILAB S.A. (Paris, FRANCE), all rights reserved. 
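# Hypothetical, Linux-only sketch for ResourceController above
# (logilab.common.proc): it relies on /proc, process groups and POSIX signals,
# so treat it as an illustration rather than portable code.
import time
from logilab.common.proc import RESOURCE_LIMIT_EXCEPTION, ResourceController

controller = ResourceController(max_time=2)  # wall-clock limit, in seconds
controller.setup_limit()
try:
    time.sleep(10)  # stands in for a long-running task
except RESOURCE_LIMIT_EXCEPTION as exc:
    print("interrupted by resource limit:", exc.__class__.__name__)
finally:
    controller.clean_limit()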
# contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of Logilab-common. # # Logilab-common is free software: you can redistribute it and/or modify it # under the terms of the GNU Lesser General Public License as published by the # Free Software Foundation, either version 2.1 of the License, or (at your # option) any later version. # # Logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with Logilab-common. If not, see . """This module provides bases for predicates dispatching (the pattern in use here is similar to what's refered as multi-dispatch or predicate-dispatch in the literature, though a bit different since the idea is to select across different implementation 'e.g. classes), not to dispatch a message to a function or method. It contains the following classes: * :class:`RegistryStore`, the top level object which loads implementation objects and stores them into registries. You'll usually use it to access registries and their contained objects; * :class:`Registry`, the base class which contains objects semantically grouped (for instance, sharing a same API, hence the 'implementation' name). You'll use it to select the proper implementation according to a context. Notice you may use registries on their own without using the store. .. Note:: implementation objects are usually designed to be accessed through the registry and not by direct instantiation, besides to use it as base classe. The selection procedure is delegated to a selector, which is responsible for scoring the object according to some context. At the end of the selection, if an implementation has been found, an instance of this class is returned. A selector is built from one or more predicates combined together using AND, OR, NOT operators (actually `&`, `|` and `~`). You'll thus find some base classes to build predicates: * :class:`Predicate`, the abstract base predicate class * :class:`AndPredicate`, :class:`OrPredicate`, :class:`NotPredicate`, which you shouldn't have to use directly. You'll use `&`, `|` and '~' operators between predicates directly * :func:`objectify_predicate` You'll eventually find one concrete predicate: :class:`yes` .. autoclass:: RegistryStore .. autoclass:: Registry Predicates ---------- .. autoclass:: Predicate .. autofunction:: objectify_predicate .. autoclass:: yes .. autoclass:: AndPredicate .. autoclass:: OrPredicate .. autoclass:: NotPredicate Debugging --------- .. autoclass:: traced_selection Exceptions ---------- .. autoclass:: RegistryException .. autoclass:: RegistryNotFound .. autoclass:: ObjectNotFound .. autoclass:: NoSelectableObject """ __docformat__ = "restructuredtext en" import sys import importlib import types import weakref from os import listdir, stat from os.path import join, isdir, exists from typing import Dict, Type, Optional, Union, Sequence from logging import getLogger from warnings import warn from typing import List, Tuple, Any, Iterable, Callable from types import ModuleType if sys.version_info >= (3, 8): from typing import TypedDict # TypedDict is also available in typing_extension > 3.7.4, # but this package is not available on debian buster. 
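# Sketch of the predicate combination described in the module docstring above;
# the one()/never() predicates are invented and rely on objectify_predicate,
# which is defined further down in this module.
from logilab.common.registry import objectify_predicate

@objectify_predicate
def one(cls, *args, **kwargs):
    return 1

@objectify_predicate
def never(cls, *args, **kwargs):
    return 0

selector = one() & ~never()  # AndPredicate(one, NotPredicate(never))
print(selector(None))        # 2 : both sub-predicates contribute a score of 1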
from logilab.common.modutils import modpath_from_file from logilab.common.logging_ext import set_log_methods from logilab.common.decorators import classproperty # selector base classes and operations ######################################## def objectify_predicate(selector_func: Callable) -> Any: """Most of the time, a simple score function is enough to build a selector. The :func:`objectify_predicate` decorator turn it into a proper selector class:: @objectify_predicate def one(cls, req, rset=None, **kwargs): return 1 class MyView(View): __select__ = View.__select__ & one() """ return type( selector_func.__name__, (Predicate,), { "__doc__": selector_func.__doc__, "__call__": lambda self, *a, **kw: selector_func(*a, **kw), }, ) _PREDICATES: Dict[int, Type] = {} def wrap_predicates(decorator: Callable) -> None: for predicate in _PREDICATES.values(): if "_decorators" not in predicate.__dict__: predicate._decorators = set() if decorator in predicate._decorators: continue predicate._decorators.add(decorator) predicate.__call__ = decorator(predicate.__call__) class PredicateMetaClass(type): def __new__(mcs, *args, **kwargs): # use __new__ so subclasses doesn't have to call Predicate.__init__ inst = type.__new__(mcs, *args, **kwargs) proxy = weakref.proxy(inst, lambda p: _PREDICATES.pop(id(p))) _PREDICATES[id(proxy)] = proxy return inst class Predicate(object, metaclass=PredicateMetaClass): """base class for selector classes providing implementation for operators ``&``, ``|`` and ``~`` This class is only here to give access to binary operators, the selector logic itself should be implemented in the :meth:`__call__` method. Notice it should usually accept any arbitrary arguments (the context), though that may vary depending on your usage of the registry. a selector is called to help choosing the correct object for a particular context by returning a score (`int`) telling how well the implementation given as first argument fit to the given context. 0 score means that the class doesn't apply. """ @property def func_name(self): # backward compatibility return self.__class__.__name__ def search_selector(self, selector: "Predicate") -> Optional["Predicate"]: """search for the given selector, selector instance or tuple of selectors in the selectors tree. Return None if not found. 
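For instance, assuming ``one`` is the predicate class built in the
:func:`objectify_predicate` example above and ``two`` is another, purely
illustrative, predicate class::

    sel = one() & two()
    # returns the ``one`` instance embedded in the compound selector
    found = sel.search_selector(one)
    # a tuple of predicate classes may be given as well
    found = sel.search_selector((one, two))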
""" if self is selector: return self if (isinstance(selector, type) or isinstance(selector, tuple)) and isinstance( self, selector ): return self return None def __str__(self): return self.__class__.__name__ def __and__(self, other: "Predicate") -> "AndPredicate": return AndPredicate(self, other) def __rand__(self, other: "Predicate") -> "AndPredicate": return AndPredicate(other, self) def __iand__(self, other: "Predicate") -> "AndPredicate": return AndPredicate(self, other) def __or__(self, other: "Predicate") -> "OrPredicate": return OrPredicate(self, other) def __ror__(self, other: "Predicate"): return OrPredicate(other, self) def __ior__(self, other: "Predicate") -> "OrPredicate": return OrPredicate(self, other) def __invert__(self): return NotPredicate(self) # XXX (function | function) or (function & function) not managed yet def __call__(self, cls, *args, **kwargs): return NotImplementedError( f"selector {self.__class__} must implement its logic in its __call__ method" ) def __repr__(self): return f"" class MultiPredicate(Predicate): """base class for compound selector classes""" def __init__(self, *selectors: Any) -> None: self.selectors = self.merge_selectors(selectors) def __str__(self): return f"{self.__class__.__name__}({','.join(str(s) for s in self.selectors)})" @classmethod def merge_selectors(cls, selectors: Sequence[Predicate]) -> List[Predicate]: """deal with selector instanciation when necessary and merge multi-selectors if possible: AndPredicate(AndPredicate(sel1, sel2), AndPredicate(sel3, sel4)) ==> AndPredicate(sel1, sel2, sel3, sel4) """ merged_selectors = [] for selector in selectors: # XXX do we really want magic-transformations below? # if so, wanna warn about them? if isinstance(selector, types.FunctionType): selector = objectify_predicate(selector)() if isinstance(selector, type) and issubclass(selector, Predicate): selector = selector() assert isinstance(selector, Predicate), selector if isinstance(selector, cls): merged_selectors += selector.selectors else: merged_selectors.append(selector) return merged_selectors def search_selector(self, selector: Predicate) -> Optional[Predicate]: """search for the given selector or selector instance (or tuple of selectors) in the selectors tree. Return None if not found """ for childselector in self.selectors: if childselector is selector: return childselector found = childselector.search_selector(selector) if found is not None: return found # if not found in children, maybe we are looking for self? return super(MultiPredicate, self).search_selector(selector) class AndPredicate(MultiPredicate): """and-chained selectors""" def __call__(self, cls: Optional[Any], *args: Any, **kwargs: Any) -> int: score = 0 for selector in self.selectors: partscore = selector(cls, *args, **kwargs) if not partscore: return 0 score += partscore return score class OrPredicate(MultiPredicate): """or-chained selectors""" def __call__(self, cls: Optional[Any], *args: Any, **kwargs: Any) -> int: for selector in self.selectors: partscore = selector(cls, *args, **kwargs) if partscore: return partscore return 0 class NotPredicate(Predicate): """negation selector""" def __init__(self, selector): self.selector = selector def __call__(self, cls, *args, **kwargs): score = self.selector(cls, *args, **kwargs) return int(not score) def __str__(self): return f"NOT({self.selector})" class yes(Predicate): # pylint: disable=C0103 """Return the score given as parameter, with a default score of 0.5 so any other selector take precedence. 
Usually used for objects which can be selected whatever the context, or also sometimes to add arbitrary points to a score. Take care, `yes(0)` could be named 'no'... """ def __init__(self, score: float = 0.5) -> None: self.score = score def __call__(self, *args, **kwargs): return self.score # deprecated stuff ############################################################# class RegistryException(Exception): """Base class for registry exception.""" class RegistryNotFound(RegistryException): """Raised when an unknown registry is requested. This is usually a programming/typo error. """ class ObjectNotFound(RegistryException): """Raised when an unregistered object is requested. This may be a programming/typo or a misconfiguration error. """ class NoSelectableObject(RegistryException): """Raised when no object is selectable for a given context.""" def __init__(self, args, kwargs, objects): self.args = args self.kwargs = kwargs self.objects = objects def __str__(self): return "args: %s, kwargs: %s\ncandidates: %s" % ( self.args, self.kwargs.keys(), self.objects, ) class SelectAmbiguity(RegistryException): """Raised when several objects compete at selection time with an equal score. """ def _modname_from_path(path: str, extrapath: Optional[Any] = None) -> str: modpath = modpath_from_file(path, extrapath) # omit '__init__' from package's name to avoid loading that module # once for each name when it is imported by some other object # module. This supposes import in modules are done as:: # # from package import something # # not:: # # from package.__init__ import something # # which seems quite correct. if modpath[-1] == "__init__": modpath.pop() return ".".join(modpath) def _toload_info( path: List[str], extrapath: Optional[Any], _toload: Optional[Tuple[Dict[str, str], List]] = None ) -> Tuple[Dict[str, str], List[Tuple[str, str]]]: """Return a dictionary of : and an ordered list of (file, module name) to load """ if _toload is None: assert isinstance(path, list) _toload = {}, [] for fileordir in path: if isdir(fileordir) and exists(join(fileordir, "__init__.py")): subfiles = [join(fileordir, fname) for fname in listdir(fileordir)] _toload_info(subfiles, extrapath, _toload) elif fileordir[-3:] == ".py": modname = _modname_from_path(fileordir, extrapath) _toload[0][modname] = fileordir _toload[1].append((fileordir, modname)) return _toload class RegistrableObject: """This is the base class for registrable objects which are selected according to a context. :attr:`__registry__` name of the registry for this object (string like 'views', 'templates'...). You may want to define `__registries__` directly if your object should be registered in several registries. :attr:`__regid__` object's identifier in the registry (string like 'main', 'primary', 'folder_box') :attr:`__select__` class'selector Moreover, the `__abstract__` attribute may be set to True to indicate that a class is abstract and should not be registered. You don't have to inherit from this class to put it in a registry (having `__regid__` and `__select__` is enough), though this is needed for classes that should be automatically registered. 
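A minimal concrete definition sticking to the attributes described above (the
registry and identifier names below are only illustrative)::

    class MainView(RegistrableObject):
        __registry__ = 'views'
        __regid__ = 'main'
        __select__ = yes()

Such a class is then picked up by the store's automatic registration when its
module is loaded.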
""" __registry__: Optional[str] = None __regid__: Optional[str] = None __select__: Union[None, str, Predicate] = None __abstract__ = True # see doc snipppets below (in Registry class) @classproperty def __registries__(cls) -> Union[Tuple[str], Tuple]: if cls.__registry__ is None: return () return (cls.__registry__,) class RegistrableInstance(RegistrableObject): """Inherit this class if you want instances of the classes to be automatically registered. """ def __new__(cls, *args, **kwargs): """Add a __module__ attribute telling the module where the instance was created, for automatic registration. """ module = kwargs.pop("__module__", None) or getattr(cls, "__module__") obj = super(RegistrableInstance, cls).__new__(cls) obj.__module__ = module return obj def __init__(self, __module__: Optional[str] = None) -> None: super(RegistrableInstance, self).__init__() if sys.version_info >= (3, 8): SelectBestReport = TypedDict( "SelectBestReport", { "all_objects": List, "end_score": int, "winners": List, "winner": Optional[Any], "self": "Registry", "args": List, "kwargs": Dict, "registry": "Registry", }, ) else: SelectBestReport = Dict[str, Union[List, int, Optional[Any], "Registry", Dict]] class Registry(dict): """The registry store a set of implementations associated to identifier: * to each identifier are associated a list of implementations * to select an implementation of a given identifier, you should use one of the :meth:`select` or :meth:`select_or_none` method * to select a list of implementations for a context, you should use the :meth:`possible_objects` method * dictionary like access to an identifier will return the bare list of implementations for this identifier. To be usable in a registry, the only requirement is to have a `__select__` attribute. At the end of the registration process, the :meth:`__registered__` method is called on each registered object which have them, given the registry in which it's registered as argument. Registration methods: .. automethod:: register .. automethod:: unregister Selection methods: .. automethod:: select .. automethod:: select_or_none .. automethod:: possible_objects .. 
automethod:: object_by_id """ def __init__(self, debugmode: bool) -> None: super(Registry, self).__init__() self.debugmode = debugmode self._select_listeners: List[Callable[[SelectBestReport], None]] = [] def __getitem__(self, name): """return the registry (list of implementation objects) associated to this name """ try: return super(Registry, self).__getitem__(name) except KeyError: exc = ObjectNotFound(name) exc.__traceback__ = sys.exc_info()[-1] raise exc @classmethod def objid(cls, obj: Any) -> str: """returns a unique identifier for an object stored in the registry""" return f"{obj.__module__}.{cls.objname(obj)}" @classmethod def objname(cls, obj: Any) -> str: """returns a readable name for an object stored in the registry""" return getattr(obj, "__name__", id(obj)) def initialization_completed(self) -> None: """call method __registered__() on registered objects when the callback is defined""" for objects in self.values(): for objectcls in objects: registered = getattr(objectcls, "__registered__", None) if registered: registered(self) if self.debugmode: wrap_predicates(_lltrace) def register(self, obj: Any, oid: Optional[Any] = None, clear: bool = False) -> None: """base method to add an object in the registry""" assert "__abstract__" not in obj.__dict__, obj assert obj.__select__, obj oid = oid or obj.__regid__ assert ( oid ), f"no explicit name supplied to register object {obj}, which has no __regid__ set" if clear: objects = self[oid] = [] else: objects = self.setdefault(oid, []) assert obj not in objects, f"object {obj} is already registered" objects.append(obj) def register_and_replace(self, obj, replaced): """remove and register """ # XXXFIXME this is a duplication of unregister() # remove register_and_replace in favor of unregister + register # or simplify by calling unregister then register here if not isinstance(replaced, str): replaced = self.objid(replaced) # prevent from misspelling assert obj is not replaced, f"replacing an object by itself: {obj}" registered_objs = self.get(obj.__regid__, ()) for index, registered in enumerate(registered_objs): if self.objid(registered) == replaced: del registered_objs[index] break else: self.warning("trying to replace %s that is not registered with %s", replaced, obj) self.register(obj) def unregister(self, obj): """remove object from this registry""" objid = self.objid(obj) oid = obj.__regid__ for registered in self.get(oid, ()): # use self.objid() to compare objects because vreg will probably # have its own version of the object, loaded through execfile if self.objid(registered) == objid: self[oid].remove(registered) break else: self.warning("can't remove %s, no id %s in the registry", objid, oid) def all_objects(self): """return a list containing all objects in this registry.""" result = [] for objs in self.values(): result += objs return result # dynamic selection methods ################################################ def object_by_id(self, oid, *args, **kwargs): """return object with the `oid` identifier. Only one object is expected to be found. raise :exc:`ObjectNotFound` if there are no object with id `oid` in this registry raise :exc:`AssertionError` if there is more than one object there """ objects = self[oid] assert len(objects) == 1, objects return objects[0](*args, **kwargs) def select(self, __oid, *args, **kwargs): """return the most specific object among those with the given oid according to the given context. 
raise :exc:`ObjectNotFound` if there are no object with id `oid` in this registry raise :exc:`NoSelectableObject` if no object can be selected """ obj = self._select_best(self[__oid], *args, **kwargs) if obj is None: raise NoSelectableObject(args, kwargs, self[__oid]) return obj def select_or_none(self, __oid, *args, **kwargs): """return the most specific object among those with the given oid according to the given context, or None if no object applies. """ try: return self._select_best(self[__oid], *args, **kwargs) except ObjectNotFound: return None def possible_objects(self, *args, **kwargs): """return an iterator on possible objects in this registry for the given context """ for objects in self.values(): obj = self._select_best(objects, *args, **kwargs) if obj is None: continue yield obj def add_select_best_listener(self, listener): """ Add a listener to the list of one parameters callables (function/method) that will be called everytime the selection Registry._select_best is done and they will recieve a dict of the following form:: {"all_objects": [], "end_score": 0, "winners": [], "winner": None or winner, "self": self, "args": args, "kwargs": kwargs, } """ self._select_listeners.append(listener) def _select_best(self, objects, *args, **kwargs): """return an instance of the most specific object according to parameters return None if not object apply (don't raise `NoSelectableObject` since it's costly when searching objects using `possible_objects` (e.g. searching for hooks). Every callable stored in Registry._select_listeners will be called once the selection is done and will recieve a dict of the following form:: {"all_objects": [], "end_score": 0, "winners": [], "winner": None or winner, "self": self, "args": args, "kwargs": kwargs, "registry": self} """ if self._select_listeners: select_best_report: SelectBestReport = { "registry": self, "all_objects": [], "end_score": 0, "winners": [], "winner": None, "self": self, "args": args, "kwargs": kwargs, } score, winners = 0, None for obj in objects: objectscore = obj.__select__(obj, *args, **kwargs) if self._select_listeners: select_best_report["all_objects"].append({"object": obj, "score": objectscore}) if objectscore > score: score, winners = objectscore, [obj] elif objectscore > 0 and objectscore == score: winners.append(obj) if self._select_listeners: select_best_report["winners"] = winners.copy() if winners else [] select_best_report["winner"] = winners[0] if winners else winners select_best_report["end_score"] = score for listener in self._select_listeners: listener(select_best_report) if winners is None: return None if len(winners) > 1: # log in production environement / test, error while debugging msg = "select ambiguity: %s\n(args: %s, kwargs: %s)" if self.debugmode: # raise bare exception in debug mode raise SelectAmbiguity(msg % (winners, args, kwargs.keys())) self.error(msg, winners, args, kwargs.keys()) # return the result of calling the object return self.selected(winners[0], args, kwargs) def selected(self, winner, args, kwargs): """override here if for instance you don't want "instanciation" """ return winner(*args, **kwargs) # these are overridden by set_log_methods below # only defining here to prevent pylint from complaining info = warning = error = critical = exception = debug = lambda msg, *a, **kw: None def obj_registries(cls: Any, registryname: Optional[Any] = None) -> Tuple[str]: """return a tuple of registry names (see __registries__)""" if registryname: return (registryname,) return cls.__registries__ class 
RegistryStore(dict): """This class is responsible for loading objects and storing them in their registry which is created on the fly as needed. It handles dynamic registration of objects and provides a convenient api to access them. To be recognized as an object that should be stored into one of the store's registry (:class:`Registry`), an object must provide the following attributes, used control how they interact with the registry: :attr:`__registries__` list of registry names (string like 'views', 'templates'...) into which the object should be registered :attr:`__regid__` object identifier in the registry (string like 'main', 'primary', 'folder_box') :attr:`__select__` the object predicate selectors Moreover, the :attr:`__abstract__` attribute may be set to `True` to indicate that an object is abstract and should not be registered (such inherited attributes not considered). .. Note:: When using the store to load objects dynamically, you *always* have to use **super()** to get the methods and attributes of the superclasses, and not use the class identifier. If not, you'll get into trouble at reload time. For example, instead of writing:: class Thing(Parent): __regid__ = 'athing' __select__ = yes() def f(self, arg1): Parent.f(self, arg1) You must write:: class Thing(Parent): __regid__ = 'athing' __select__ = yes() def f(self, arg1): super(Thing, self).f(arg1) Controlling object registration ------------------------------- Dynamic loading is triggered by calling the :meth:`register_modnames` method, given a list of modules names to inspect. .. automethod:: register_modnames For each module, by default, all compatible objects are registered automatically. However if some objects come as replacement of other objects, or have to be included only if some condition is met, you'll have to define a `registration_callback(vreg)` function in the module and explicitly register **all objects** in this module, using the api defined below. .. automethod:: RegistryStore.register_all .. automethod:: RegistryStore.register_and_replace .. automethod:: RegistryStore.register .. automethod:: RegistryStore.unregister .. Note:: Once the function `registration_callback(vreg)` is implemented in a module, all the objects from this module have to be explicitly registered as it disables the automatic object registration. Examples: .. sourcecode:: python def registration_callback(store): # register everything in the module except BabarClass store.register_all(globals().values(), __name__, (BabarClass,)) # conditionally register BabarClass if 'babar_relation' in store.schema: store.register(BabarClass) In this example, we register all application object classes defined in the module except `BabarClass`. This class is then registered only if the 'babar_relation' relation type is defined in the instance schema. .. sourcecode:: python def registration_callback(store): store.register(Elephant) # replace Babar by Celeste store.register_and_replace(Celeste, Babar) In this example, we explicitly register classes one by one: * the `Elephant` class * the `Celeste` to replace `Babar` If at some point we register a new appobject class in this module, it won't be registered at all without modification to the `registration_callback` implementation. The first example will register it though, thanks to the call to the `register_all` method. Controlling registry instantiation ---------------------------------- The `REGISTRY_FACTORY` class dictionary allows to specify which class should be instantiated for a given registry name. 
The class associated to `None` key will be the class used when there is no specific class for a name. """ def __init__(self, debugmode: bool = False) -> None: super(RegistryStore, self).__init__() self.debugmode = debugmode def reset(self) -> None: """clear all registries managed by this store""" # don't use self.clear, we want to keep existing subdictionaries for subdict in self.values(): subdict.clear() self._lastmodifs: Dict[str, int] = {} def __getitem__(self, name: str) -> Registry: """return the registry (dictionary of class objects) associated to this name """ try: return super(RegistryStore, self).__getitem__(name) except KeyError: exc = RegistryNotFound(name) exc.__traceback__ = sys.exc_info()[-1] raise exc # methods for explicit (un)registration ################################### # default class, when no specific class set REGISTRY_FACTORY: Dict[Union[None, str], type] = {None: Registry} def registry_class(self, regid: str) -> type: """return existing registry named regid or use factory to create one and return it""" try: return self.REGISTRY_FACTORY[regid] except KeyError: return self.REGISTRY_FACTORY[None] # mypy: Signature of "setdefault" incompatible with supertype "MutableMapping"" # I don't know how to overload signatures of method in mypy def setdefault(self, regid: str) -> Registry: # type: ignore try: return self[regid] except RegistryNotFound: self[regid] = self.registry_class(regid)(self.debugmode) return self[regid] def register_all(self, objects: Iterable, modname: str, butclasses: Sequence = ()) -> None: """register registrable objects into `objects`. Registrable objects are properly configured subclasses of :class:`RegistrableObject`. Objects which are not defined in the module `modname` or which are in `butclasses` won't be registered. Typical usage is: .. sourcecode:: python store.register_all(globals().values(), __name__, (ClassIWantToRegisterExplicitly,)) So you get partially automatic registration, keeping manual registration for some object (to use :meth:`~logilab.common.registry.RegistryStore.register_and_replace` for instance). """ assert isinstance( modname, str ), f"modname expected to be a module name (ie string), got {modname!r}" for obj in objects: if self.is_registrable(obj) and obj.__module__ == modname and obj not in butclasses: if isinstance(obj, type): self._load_ancestors_then_object(modname, obj, butclasses) else: self.register(obj) def register( self, obj: Any, registryname: Optional[Any] = None, oid: Optional[Any] = None, clear: bool = False, ) -> None: """register `obj` implementation into `registryname` or `obj.__registries__` if not specified, with identifier `oid` or `obj.__regid__` if not specified. If `clear` is true, all objects with the same identifier will be previously unregistered. """ assert not obj.__dict__.get("__abstract__"), obj for registryname in obj_registries(obj, registryname): registry = self.setdefault(registryname) registry.register(obj, oid=oid, clear=clear) self.debug( "register %s in %s['%s']", registry.objname(obj), registryname, oid or obj.__regid__ ) self._loadedmods.setdefault(obj.__module__, {})[registry.objid(obj)] = obj def unregister(self, obj, registryname=None): """unregister `obj` object from the registry `registryname` or `obj.__registries__` if not specified. 
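This is typically done from a module level `registration_callback`, for
instance (``OldThing`` and ``NewThing`` stand for some already defined classes
in this illustrative sketch):

.. sourcecode:: python

    def registration_callback(store):
        store.unregister(OldThing)
        store.register(NewThing)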
""" for registryname in obj_registries(obj, registryname): registry = self[registryname] registry.unregister(obj) self.debug( "unregister %s from %s['%s']", registry.objname(obj), registryname, obj.__regid__ ) def register_and_replace(self, obj, replaced, registryname=None): """register `obj` object into `registryname` or `obj.__registries__` if not specified. If found, the `replaced` object will be unregistered first (else a warning will be issued as it is generally unexpected). """ for registryname in obj_registries(obj, registryname): registry = self[registryname] registry.register_and_replace(obj, replaced) self.debug( "register %s in %s['%s'] instead of %s", registry.objname(obj), registryname, obj.__regid__, registry.objname(replaced), ) # initialization methods ################################################### def init_registration( self, path: List[str], extrapath: Optional[Any] = None ) -> List[Tuple[str, str]]: """reset registry and walk down path to return list of (path, name) file modules to be loaded""" # XXX make this private by renaming it to _init_registration ? self.reset() # compute list of all modules that have to be loaded self._toloadmods, filemods = _toload_info(path, extrapath) # XXX is _loadedmods still necessary ? It seems like it's useful # to avoid loading same module twice, especially with the # _load_ancestors_then_object logic but this needs to be checked self._loadedmods: Dict[str, Dict[str, type]] = {} return filemods def register_modnames(self, modnames: List[str]) -> None: """register all objects found in """ self.reset() self._loadedmods = {} self._toloadmods = {} toload = [] for modname in modnames: loader = importlib.util.find_spec(modname).loader assert loader is not None # mypy: "Loader" has no attribute "get_filename" # the selected class has one filepath = loader.get_filename() # type: ignore if filepath[-4:] in (".pyc", ".pyo"): # The source file *must* exists filepath = filepath[:-1] self._toloadmods[modname] = filepath toload.append((filepath, modname)) for filepath, modname in toload: self.load_file(filepath, modname) self.initialization_completed() def initialization_completed(self) -> None: """call initialization_completed() on all known registries""" for reg in self.values(): reg.initialization_completed() def _mdate(self, filepath: str) -> Optional[int]: """return the modification date of a file path""" try: return stat(filepath)[-2] except OSError: # this typically happens on emacs backup files (.#foo.py) self.warning("Unable to load %s. 
It is likely to be a backup file", filepath) return None def is_reload_needed(self, path): """return True if something module changed and the registry should be reloaded """ lastmodifs = self._lastmodifs for fileordir in path: if isdir(fileordir) and exists(join(fileordir, "__init__.py")): if self.is_reload_needed([join(fileordir, fname) for fname in listdir(fileordir)]): return True elif fileordir[-3:] == ".py": mdate = self._mdate(fileordir) if mdate is None: continue # backup file, see _mdate implementation elif "flymake" in fileordir: # flymake + pylint in use, don't consider these they will corrupt the registry continue if fileordir not in lastmodifs or lastmodifs[fileordir] < mdate: self.info("File %s changed since last visit", fileordir) return True return False def load_file(self, filepath: str, modname: str) -> None: """load registrable objects (if any) from a python file""" if modname in self._loadedmods: return self._loadedmods[modname] = {} mdate = self._mdate(filepath) if mdate is None: return # backup file, see _mdate implementation elif "flymake" in filepath: # flymake + pylint in use, don't consider these they will corrupt the registry return # set update time before module loading, else we get some reloading # weirdness in case of syntax error or other error while importing the # module self._lastmodifs[filepath] = mdate # load the module if sys.version_info < (3,) and not isinstance(modname, str): modname = str(modname) module = __import__(modname, fromlist=modname.split(".")[:-1]) self.load_module(module) def load_module(self, module: ModuleType) -> None: """Automatically handle module objects registration. Instances are registered as soon as they are hashable and have the following attributes: * __regid__ (a string) * __select__ (a callable) * __registries__ (a tuple/list of string) For classes this is a bit more complicated : - first ensure parent classes are already registered - class with __abstract__ == True in their local dictionary are skipped - object class needs to have registries and identifier properly set to a non empty string to be registered. 
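As an illustrative sketch of those rules (class and registry names are made up
for the example):

.. sourcecode:: python

    class BaseThing(RegistrableObject):
        __abstract__ = True         # skipped, only loaded as an ancestor
        __registry__ = 'things'
        __select__ = yes()

    class ConcreteThing(BaseThing):
        __regid__ = 'concrete'      # registered into the 'things' registry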
""" self.info("loading %s from %s", module.__name__, module.__file__) if hasattr(module, "registration_callback"): # mypy: Module has no attribute "registration_callback" # we check that before module.registration_callback(self) # type: ignore else: self.register_all(vars(module).values(), module.__name__) def _load_ancestors_then_object( self, modname: str, objectcls: type, butclasses: Sequence[Any] = () ) -> None: """handle class registration according to rules defined in :meth:`load_module` """ # backward compat, we used to allow whatever else than classes if not isinstance(objectcls, type): if self.is_registrable(objectcls) and objectcls.__module__ == modname: self.register(objectcls) return # imported classes objmodname = objectcls.__module__ if objmodname != modname: # The module of the object is not the same as the currently # worked on module, or this is actually an instance, which # has no module at all if objmodname in self._toloadmods: # if this is still scheduled for loading, let's proceed immediately, # but using the object module self.load_file(self._toloadmods[objmodname], objmodname) return # ensure object hasn't been already processed clsid = f"{modname}.{objectcls.__name__}" if clsid in self._loadedmods[modname]: return self._loadedmods[modname][clsid] = objectcls # ensure ancestors are registered for parent in objectcls.__bases__: self._load_ancestors_then_object(modname, parent, butclasses) # ensure object is registrable if objectcls in butclasses or not self.is_registrable(objectcls): return # backward compat reg = self.setdefault(obj_registries(objectcls)[0]) if reg.objname(objectcls)[0] == "_": warn( "[lgc 0.59] object whose name start with '_' won't be " "skipped anymore at some point, use __abstract__ = True " "instead (%s)" % objectcls, DeprecationWarning, ) return # register, finally self.register(objectcls) @classmethod def is_registrable(cls, obj: Any) -> bool: """ensure `obj` should be registered as arbitrary stuff may be registered, do a lot of check and warn about weird cases (think to dumb proxy objects) """ if isinstance(obj, type): if not issubclass(obj, RegistrableObject): # ducktyping backward compat if not ( getattr(obj, "__registries__", None) and getattr(obj, "__regid__", None) and getattr(obj, "__select__", None) ): return False elif issubclass(obj, RegistrableInstance): return False elif not isinstance(obj, RegistrableInstance): return False if not obj.__regid__: return False # no regid registries = obj.__registries__ if not registries: return False # no registries selector = obj.__select__ if not selector: return False # no selector if obj.__dict__.get("__abstract__", False): return False # then detect potential problems that should be warned if not isinstance(registries, (tuple, list)): cls.warning("%s has __registries__ which is not a list or tuple", obj) return False if not callable(selector): cls.warning("%s has not callable __select__", obj) return False return True # these are overridden by set_log_methods below # only defining here to prevent pylint from complaining info = warning = error = critical = exception = debug = lambda msg, *a, **kw: None # init logging set_log_methods(RegistryStore, getLogger("registry.store")) set_log_methods(Registry, getLogger("registry")) # helpers for debugging selectors TRACED_OIDS = None def _trace_selector(cls, selector, args, ret): vobj = args[0] if TRACED_OIDS == "all" or vobj.__regid__ in TRACED_OIDS: print(f"{cls} -> {ret} for {vobj}({vobj.__regid__})") def _lltrace(selector): """use this decorator on 
your predicates so they become traceable with :class:`traced_selection` """ def traced(cls, *args, **kwargs): ret = selector(cls, *args, **kwargs) if TRACED_OIDS is not None: _trace_selector(cls, selector, args, ret) return ret traced.__name__ = selector.__name__ traced.__doc__ = selector.__doc__ return traced class traced_selection(object): # pylint: disable=C0103 """ Typical usage is : .. sourcecode:: python >>> from logilab.common.registry import traced_selection >>> with traced_selection(): ... # some code in which you want to debug selectors ... # for all objects This will yield lines like this in the logs:: selector one_line_rset returned 0 for You can also give to :class:`traced_selection` the identifiers of objects on which you want to debug selection ('oid1' and 'oid2' in the example above). .. sourcecode:: python >>> with traced_selection( ('regid1', 'regid2') ): ... # some code in which you want to debug selectors ... # for objects with __regid__ 'regid1' and 'regid2' A potentially useful point to set up such a tracing function is the `logilab.common.registry.Registry.select` method body. """ def __init__(self, traced="all"): self.traced = traced def __enter__(self): global TRACED_OIDS TRACED_OIDS = self.traced def __exit__(self, exctype, exc, traceback): global TRACED_OIDS TRACED_OIDS = None return traceback is None ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/shellutils.py0000666000000000000000000003002314762603732020710 0ustar00rootroot# copyright 2003-2014 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """shell/term utilities, useful to write some python scripts instead of shell scripts. """ __docformat__ = "restructuredtext en" import os import glob import shutil import sys import tempfile import fnmatch import string import random import warnings from os.path import exists, isdir, islink, basename, join from _io import StringIO from typing import Any, Callable, Optional, List, Union, Iterator, Tuple from logilab.common import STD_BLACKLIST, _handle_blacklist class tempdir: def __enter__(self): self.path = tempfile.mkdtemp() return self.path def __exit__(self, exctype, value, traceback): # rmtree in all cases shutil.rmtree(self.path) return traceback is None class pushd: def __init__(self, directory): self.directory = directory def __enter__(self): self.cwd = os.getcwd() os.chdir(self.directory) return self.directory def __exit__(self, exctype, value, traceback): os.chdir(self.cwd) def chown(path, login=None, group=None): """Same as `os.chown` function but accepting user login or group name as argument. If login or group is omitted, it's left unchanged. Note: you must own the file to chown it (or be root). Otherwise OSError is raised. 
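For instance (the path, user and group below are purely illustrative)::

    chown('/var/log/myapp.log', login='myapp')            # change owner only
    chown('/var/log/myapp.log', login='myapp', group='adm')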
""" if login is None: uid = -1 else: try: uid = int(login) except ValueError: import pwd # Platforms: Unix uid = pwd.getpwnam(login).pw_uid if group is None: gid = -1 else: try: gid = int(group) except ValueError: import grp gid = grp.getgrnam(group).gr_gid os.chown(path, uid, gid) def mv(source, destination, _action=shutil.move): """A shell-like mv, supporting wildcards.""" sources = glob.glob(source) if len(sources) > 1: assert isdir(destination) for filename in sources: _action(filename, join(destination, basename(filename))) else: try: source = sources[0] except IndexError: raise OSError(f"No file matching {source}") if isdir(destination) and exists(destination): destination = join(destination, basename(source)) try: _action(source, destination) except OSError as ex: raise OSError(f"Unable to move {source!r} to {destination!r} ({ex})") def rm(*files): """A shell-like rm, supporting wildcards.""" for wfile in files: for filename in glob.glob(wfile): if islink(filename): os.remove(filename) elif isdir(filename): shutil.rmtree(filename) else: os.remove(filename) def cp(source, destination): """A shell-like cp, supporting wildcards.""" mv(source, destination, _action=shutil.copy) def find( directory: str, exts: Union[Tuple[str, ...], str], exclude: bool = False, blacklist: Tuple[str, ...] = STD_BLACKLIST, ) -> List[str]: """Recursively find files ending with the given extensions from the directory. :type directory: str :param directory: directory where the search should start :type exts: basestring or list or tuple :param exts: extensions or lists or extensions to search :type exclude: boolean :param exts: if this argument is True, returning files NOT ending with the given extensions :type blacklist: list or tuple :param blacklist: optional list of files or directory to ignore, default to the value of `logilab.common.STD_BLACKLIST` :rtype: list :return: the list of all matching files """ if isinstance(exts, str): exts = (exts,) if exclude: def match(filename: str, exts: Tuple[str, ...]) -> bool: for ext in exts: if filename.endswith(ext): return False return True else: def match(filename: str, exts: Tuple[str, ...]) -> bool: for ext in exts: if filename.endswith(ext): return True return False files = [] for dirpath, dirnames, filenames in os.walk(directory): _handle_blacklist(blacklist, dirnames, filenames) # don't append files if the directory is blacklisted dirname = basename(dirpath) if dirname in blacklist: continue files.extend([join(dirpath, f) for f in filenames if match(f, exts)]) return files def globfind( directory: str, pattern: str, blacklist: Tuple[str, str, str, str, str, str, str, str] = STD_BLACKLIST, ) -> Iterator[str]: """Recursively finds files matching glob `pattern` under `directory`. This is an alternative to `logilab.common.shellutils.find`. :type directory: str :param directory: directory where the search should start :type pattern: basestring :param pattern: the glob pattern (e.g *.py, foo*.py, etc.) 
:type blacklist: list or tuple :param blacklist: optional list of files or directory to ignore, default to the value of `logilab.common.STD_BLACKLIST` :rtype: iterator :return: iterator over the list of all matching files """ for curdir, dirnames, filenames in os.walk(directory): _handle_blacklist(blacklist, dirnames, filenames) for fname in fnmatch.filter(filenames, pattern): yield join(curdir, fname) def unzip(archive, destdir): import zipfile if not exists(destdir): os.mkdir(destdir) zfobj = zipfile.ZipFile(archive) for name in zfobj.namelist(): if name.endswith("/"): os.mkdir(join(destdir, name)) else: outfile = open(join(destdir, name), "wb") outfile.write(zfobj.read(name)) outfile.close() class ProgressBar: """A simple text progression bar.""" def __init__( self, nbops: int, size: int = 20, stream: StringIO = sys.stdout, title: str = "" ) -> None: if title: self._fstr = f"\r{title} [%-{int(size)}s]" else: self._fstr = f"\r[%-{int(size)}s]" self._stream = stream self._total = nbops self._size = size self._current = 0 self._progress = 0 self._current_text = None self._last_text_write_size = 0 def _get_text(self): return self._current_text def _set_text(self, text=None): if text != self._current_text: self._current_text = text self.refresh() def _del_text(self): self.text = None text = property(_get_text, _set_text, _del_text) def update(self, offset: int = 1, exact: bool = False) -> None: """Move FORWARD to new cursor position (cursor will never go backward). :offset: fraction of ``size`` :exact: - False: offset relative to current cursor position if True - True: offset as an asbsolute position """ if exact: self._current = offset else: self._current += offset progress = int((float(self._current) / float(self._total)) * self._size) if progress > self._progress: self._progress = progress self.refresh() def refresh(self) -> None: """Refresh the progression bar display.""" self._stream.write(self._fstr % ("=" * min(self._progress, self._size))) if self._last_text_write_size or self._current_text: template = " %%-%is" % (self._last_text_write_size) text = self._current_text if text is None: text = "" self._stream.write(template % text) self._last_text_write_size = len(text.rstrip()) self._stream.flush() def finish(self): self._stream.write("\n") self._stream.flush() class DummyProgressBar: __slots__ = ("text",) def refresh(self): pass def update(self): pass def finish(self): pass _MARKER = object() class progress: def __init__(self, nbops=_MARKER, size=_MARKER, stream=_MARKER, title=_MARKER, enabled=True): self.nbops = nbops self.size = size self.stream = stream self.title = title self.enabled = enabled def __enter__(self): if self.enabled: kwargs = {} for attr in ("nbops", "size", "stream", "title"): value = getattr(self, attr) if value is not _MARKER: kwargs[attr] = value self.pb = ProgressBar(**kwargs) else: self.pb = DummyProgressBar() return self.pb def __exit__(self, exc_type, exc_val, exc_tb): self.pb.finish() class RawInput: def __init__( self, input_function: Optional[Callable] = None, printer: Optional[Callable] = None, **kwargs: Any, ) -> None: if "input" in kwargs: input_function = kwargs.pop("input") warnings.warn( "'input' argument is deprecated," "use 'input_function' instead", DeprecationWarning, ) self._input = input_function or input self._print = printer def ask(self, question: str, options: Tuple[str, ...], default: str) -> str: assert default in options choices = [] for option in options: if option == default: label = option[0].upper() else: label = option[0].lower() 
if len(option) > 1: label += f"({option[1:].lower()})" choices.append((option, label)) prompt = f"{question} [{'/'.join([opt[1] for opt in choices])}]: " tries = 3 while tries > 0: answer = self._input(prompt).strip().lower() if not answer: return default possible = [option for option, label in choices if option.lower().startswith(answer)] if len(possible) == 1: return possible[0] elif len(possible) == 0: msg = f"{answer} is not an option." else: msg = "%s is an ambiguous answer, do you mean %s ?" % ( answer, " or ".join(possible), ) if self._print: self._print(msg) else: print(msg) tries -= 1 raise Exception("unable to get a sensible answer") def confirm(self, question: str, default_is_yes: bool = True) -> bool: default = default_is_yes and "y" or "n" answer = self.ask(question, ("y", "n"), default) return answer == "y" ASK = RawInput() def getlogin(): """avoid using os.getlogin() because of strange tty / stdin problems (man 3 getlogin) Another solution would be to use $LOGNAME, $USER or $USERNAME """ if sys.platform != "win32": import pwd # Platforms: Unix return pwd.getpwuid(os.getuid())[0] else: return os.environ["USERNAME"] def generate_password(length=8, vocab=string.ascii_letters + string.digits): """dumb password generation function""" pwd = "" for i in range(length): pwd += random.choice(vocab) return pwd ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/sphinx_ext.py0000666000000000000000000000631014762603732020713 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . 
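# Note: this module is meant to be used as a Sphinx extension; a minimal,
# illustrative way to enable it is to list it in the ``extensions`` variable of
# a project's ``conf.py`` so that Sphinx calls the ``setup(app)`` function
# defined below, e.g.::
#
#     extensions = ['logilab.common.sphinx_ext']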
from logilab.common.decorators import monkeypatch from sphinx.ext import autodoc from sphinx.ext.autodoc import ( ViewList, Options, AutodocReporter, nodes, assemble_option_dict, nested_parse_with_titles, ) class DocstringOnlyModuleDocumenter(autodoc.ModuleDocumenter): objtype = "docstring" def format_signature(self): pass def add_directive_header(self, sig): pass def document_members(self, all_members=False): pass def resolve_name(self, modname, parents, path, base): if modname is not None: return modname, parents + [base] return (path or "") + base, [] # autodoc.add_documenter(DocstringOnlyModuleDocumenter) def setup(app): app.add_autodocumenter(DocstringOnlyModuleDocumenter) @monkeypatch(autodoc.AutoDirective) def run(self): self.filename_set = set() # a set of dependent filenames self.reporter = self.state.document.reporter self.env = self.state.document.settings.env self.warnings = [] self.result = ViewList() # find out what documenter to call objtype = self.name[4:] doc_class = self._registry[objtype] # process the options with the selected documenter's option_spec self.genopt = Options(assemble_option_dict(self.options.items(), doc_class.option_spec)) # generate the output documenter = doc_class(self, self.arguments[0]) documenter.generate(more_content=self.content) if not self.result: return self.warnings # record all filenames as dependencies -- this will at least # partially make automatic invalidation possible for fn in self.filename_set: self.env.note_dependency(fn) # use a custom reporter that correctly assigns lines to source # filename/description and lineno old_reporter = self.state.memo.reporter self.state.memo.reporter = AutodocReporter(self.result, self.state.memo.reporter) if self.name in ("automodule", "autodocstring"): node = nodes.section() # necessary so that the child nodes get the right source/line set node.document = self.state.document nested_parse_with_titles(self.state, self.result, node) else: node = nodes.paragraph() node.document = self.state.document self.state.nested_parse(self.result, 0, node) self.state.memo.reporter = old_reporter return self.warnings + node.children ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/sphinxutils.py0000666000000000000000000001014414762603732021114 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Sphinx utils ModuleGenerator: Generate a file that lists all the modules of a list of packages in order to pull all the docstring. This should not be used in a makefile to systematically generate sphinx documentation! 
Typical usage: >>> from logilab.common.sphinxutils import ModuleGenerator >>> mgen = ModuleGenerator('logilab common', '/home/adim/src/logilab/common') >>> mgen.generate('api_logilab_common.rst', exclude_dirs=('test',)) """ import sys import os.path as osp import inspect from logilab.common import STD_BLACKLIST from logilab.common.shellutils import globfind from logilab.common.modutils import modpath_from_file def module_members(module): members = [] for name, value in inspect.getmembers(module): if getattr(value, "__module__", None) == module.__name__: members.append((name, value)) return sorted(members) def class_members(klass): return sorted( [ name for name in vars(klass) if name not in ("__doc__", "__module__", "__dict__", "__weakref__") ] ) class ModuleGenerator: file_header = """.. -*- coding: utf-8 -*-\n\n%s\n""" module_def = """ :mod:`%s` =======%s .. automodule:: %s :members: %s """ class_def = """ .. autoclass:: %s :members: %s """ def __init__(self, project_title, code_dir): self.title = project_title self.code_dir = osp.abspath(code_dir) def generate(self, dest_file, exclude_dirs=STD_BLACKLIST): """make the module file""" self.fn = open(dest_file, "w") num = len(self.title) + 6 title = "=" * num + f"\n {self.title} API\n" + "=" * num self.fn.write(self.file_header % title) self.gen_modules(exclude_dirs=exclude_dirs) self.fn.close() def gen_modules(self, exclude_dirs): """generate all modules""" for module in self.find_modules(exclude_dirs): modname = module.__name__ classes = [] modmembers = [] for objname, obj in module_members(module): if inspect.isclass(obj): classmembers = class_members(obj) classes.append((objname, classmembers)) else: modmembers.append(objname) self.fn.write( self.module_def % (modname, "=" * len(modname), modname, ", ".join(modmembers)) ) for klass, members in classes: self.fn.write(self.class_def % (klass, ", ".join(members))) def find_modules(self, exclude_dirs): basepath = osp.dirname(self.code_dir) basedir = osp.basename(basepath) + osp.sep if basedir not in sys.path: sys.path.insert(1, basedir) for filepath in globfind(self.code_dir, "*.py", exclude_dirs): if osp.basename(filepath) in ("setup.py", "__pkginfo__.py"): continue dotted_path = modpath_from_file(filepath) module = type(".".join(dotted_path), (), {}) # mock it yield module if __name__ == "__main__": # example : title, code_dir, outfile = sys.argv[1:] generator = ModuleGenerator(title, code_dir) # XXX modnames = ['logilab'] generator.generate(outfile, ("test", "tests", "examples", "data", "doc", ".hg", "migration")) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/table.py0000666000000000000000000010141614762603732017614 0ustar00rootroot# copyright 2003-2012 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. 
# # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Table management module.""" from types import CodeType from typing import Any, List, Optional, Tuple, Union, Dict, Iterator from _io import StringIO import re __docformat__ = "restructuredtext en" class Table: """Table defines a data table with column and row names. inv:: len(self.data) <= len(self.row_names) forall(self.data, lambda x: len(x) <= len(self.col_names)) """ def __init__( self, default_value: int = 0, col_names: Optional[List[str]] = None, row_names: Optional[Any] = None, ) -> None: self.col_names: List = [] self.row_names: List = [] self.data: List = [] self.default_value: int = default_value if col_names: self.create_columns(col_names) if row_names: self.create_rows(row_names) def _next_row_name(self) -> str: return f"row{len(self.row_names) + 1}" def __iter__(self) -> Iterator: return iter(self.data) # def __eq__(self, other: Union[List[List[int]], List[Tuple[str, str, str, float]]]) -> bool: def __eq__(self, other: object) -> bool: def is_iterable(variable: Any) -> bool: try: iter(variable) except TypeError: return False else: return True if other is None: return False elif is_iterable(other): # mypy: No overload variant of "list" matches argument type "object" # checked before return list(self) == list(other) # type: ignore else: return False __hash__ = object.__hash__ def __ne__(self, other): return not self == other def __len__(self) -> int: return len(self.row_names) # Rows / Columns creation ################################################# def create_rows(self, row_names: List[str]) -> None: """Appends row_names to the list of existing rows""" self.row_names.extend(row_names) for row_name in row_names: self.data.append([self.default_value] * len(self.col_names)) def create_columns(self, col_names: List[str]) -> None: """Appends col_names to the list of existing columns""" for col_name in col_names: self.create_column(col_name) def create_row(self, row_name: str = None) -> None: """Creates a rowname to the row_names list""" row_name = row_name or self._next_row_name() self.row_names.append(row_name) self.data.append([self.default_value] * len(self.col_names)) def create_column(self, col_name: str) -> None: """Creates a colname to the col_names list""" self.col_names.append(col_name) for row in self.data: row.append(self.default_value) # Sort by column ########################################################## def sort_by_column_id(self, col_id: str, method: str = "asc") -> None: """Sorts the table (in-place) according to data stored in col_id""" try: col_index = self.col_names.index(col_id) self.sort_by_column_index(col_index, method) except ValueError: raise KeyError(f"Col ({col_id}) not found in table") def sort_by_column_index(self, col_index: int, method: str = "asc") -> None: """Sorts the table 'in-place' according to data stored in col_index method should be in ('asc', 'desc') """ sort_list = sorted( [(row[col_index], row, row_name) for row, row_name in zip(self.data, self.row_names)] ) # Sorting sort_list will sort according to col_index # If we want reverse sort, then reverse list if method.lower() == "desc": sort_list.reverse() # Rebuild data / row names self.data = [] self.row_names = [] for val, row, row_name in sort_list: self.data.append(row) self.row_names.append(row_name) def groupby( self, colname: str, *others: str ) -> Union[Dict[str, Dict[str, "Table"]], Dict[str, "Table"]]: """builds indexes of data :returns: nested 
dictionaries pointing to actual rows """ groups: Dict = {} colnames = (colname,) + others col_indexes = [self.col_names.index(col_id) for col_id in colnames] for row in self.data: ptr = groups for col_index in col_indexes[:-1]: ptr = ptr.setdefault(row[col_index], {}) table = ptr.setdefault( row[col_indexes[-1]], Table(default_value=self.default_value, col_names=self.col_names), ) table.append_row(tuple(row)) return groups def select(self, colname: str, value: str) -> "Table": grouped = self.groupby(colname) try: # mypy: Incompatible return value type (got "Union[Dict[str, Table], Table]", # mypy: expected "Table") # I guess we are sure we'll get a Table here? return grouped[value] # type: ignore except KeyError: return Table() def remove(self, colname, value): col_index = self.col_names.index(colname) for row in self.data[:]: if row[col_index] == value: self.data.remove(row) # The 'setter' part ####################################################### def set_cell(self, row_index: int, col_index: int, data: int) -> None: """sets value of cell 'row_indew', 'col_index' to data""" self.data[row_index][col_index] = data def set_cell_by_ids(self, row_id: str, col_id: str, data: Union[int, str]) -> None: """sets value of cell mapped by row_id and col_id to data Raises a KeyError if row_id or col_id are not found in the table """ try: row_index = self.row_names.index(row_id) except ValueError: raise KeyError(f"Row ({row_id}) not found in table") else: try: col_index = self.col_names.index(col_id) self.data[row_index][col_index] = data except ValueError: raise KeyError(f"Column ({col_id}) not found in table") def set_row(self, row_index: int, row_data: Union[List[float], List[int], List[str]]) -> None: """sets the 'row_index' row pre:: type(row_data) == types.ListType len(row_data) == len(self.col_names) """ self.data[row_index] = row_data def set_row_by_id(self, row_id: str, row_data: List[str]) -> None: """sets the 'row_id' column pre:: type(row_data) == types.ListType len(row_data) == len(self.row_names) Raises a KeyError if row_id is not found """ try: row_index = self.row_names.index(row_id) self.set_row(row_index, row_data) except ValueError: raise KeyError(f"Row ({row_id}) not found in table") def append_row( self, row_data: Union[List[Union[float, str]], List[int]], row_name: Optional[str] = None ) -> int: """Appends a row to the table pre:: type(row_data) == types.ListType len(row_data) == len(self.col_names) """ row_name = row_name or self._next_row_name() self.row_names.append(row_name) self.data.append(row_data) return len(self.data) - 1 def insert_row(self, index: int, row_data: List[str], row_name: str = None) -> None: """Appends row_data before 'index' in the table. To make 'insert' behave like 'list.insert', inserting in an out of range index will insert row_data to the end of the list pre:: type(row_data) == types.ListType len(row_data) == len(self.col_names) """ row_name = row_name or self._next_row_name() self.row_names.insert(index, row_name) self.data.insert(index, row_data) def delete_row(self, index: int) -> List[str]: """Deletes the 'index' row in the table, and returns it. Raises an IndexError if index is out of range """ self.row_names.pop(index) return self.data.pop(index) def delete_row_by_id(self, row_id: str) -> None: """Deletes the 'row_id' row in the table. Raises a KeyError if row_id was not found. 
""" try: row_index = self.row_names.index(row_id) self.delete_row(row_index) except ValueError: raise KeyError(f"Row ({row_id}) not found in table") def set_column(self, col_index: int, col_data: Union[List[int], range]) -> None: """sets the 'col_index' column pre:: type(col_data) == types.ListType len(col_data) == len(self.row_names) """ for row_index, cell_data in enumerate(col_data): self.data[row_index][col_index] = cell_data def set_column_by_id(self, col_id: str, col_data: Union[List[int], range]) -> None: """sets the 'col_id' column pre:: type(col_data) == types.ListType len(col_data) == len(self.col_names) Raises a KeyError if col_id is not found """ try: col_index = self.col_names.index(col_id) self.set_column(col_index, col_data) except ValueError: raise KeyError(f"Column ({col_id}) not found in table") def append_column(self, col_data: range, col_name: str) -> None: """Appends the 'col_index' column pre:: type(col_data) == types.ListType len(col_data) == len(self.row_names) """ self.col_names.append(col_name) for row_index, cell_data in enumerate(col_data): self.data[row_index].append(cell_data) def insert_column(self, index: int, col_data: range, col_name: str) -> None: """Appends col_data before 'index' in the table. To make 'insert' behave like 'list.insert', inserting in an out of range index will insert col_data to the end of the list pre:: type(col_data) == types.ListType len(col_data) == len(self.row_names) """ self.col_names.insert(index, col_name) for row_index, cell_data in enumerate(col_data): self.data[row_index].insert(index, cell_data) def delete_column(self, index: int) -> List[int]: """Deletes the 'index' column in the table, and returns it. Raises an IndexError if index is out of range """ self.col_names.pop(index) return [row.pop(index) for row in self.data] def delete_column_by_id(self, col_id: str) -> None: """Deletes the 'col_id' col in the table. Raises a KeyError if col_id was not found. 
""" try: col_index = self.col_names.index(col_id) self.delete_column(col_index) except ValueError: raise KeyError(f"Column ({col_id}) not found in table") # The 'getter' part ####################################################### def get_shape(self) -> Tuple[int, int]: """Returns a tuple which represents the table's shape""" return len(self.row_names), len(self.col_names) shape = property(get_shape) def __getitem__( self, indices: Union[Tuple[Union[int, slice, str], Union[int, str]], int, slice] ) -> Any: """provided for convenience""" multirows: bool = False multicols: bool = False rows: slice cols: slice rows_indice: Union[int, slice, str] cols_indice: Union[int, str, None] = None if isinstance(indices, tuple): rows_indice = indices[0] if len(indices) > 1: cols_indice = indices[1] else: rows_indice = indices # define row slice if isinstance(rows_indice, str): try: rows_indice = self.row_names.index(rows_indice) except ValueError: raise KeyError(f"Row ({rows_indice}) not found in table") if isinstance(rows_indice, int): rows = slice(rows_indice, rows_indice + 1) multirows = False else: rows = slice(None) multirows = True # define col slice if isinstance(cols_indice, str): try: cols_indice = self.col_names.index(cols_indice) except ValueError: raise KeyError(f"Column ({cols_indice}) not found in table") if isinstance(cols_indice, int): cols = slice(cols_indice, cols_indice + 1) multicols = False else: cols = slice(None) multicols = True # get sub-table tab = Table() tab.default_value = self.default_value tab.create_rows(self.row_names[rows]) tab.create_columns(self.col_names[cols]) for idx, row in enumerate(self.data[rows]): tab.set_row(idx, row[cols]) if multirows: if multicols: return tab else: return [item[0] for item in tab.data] else: if multicols: return tab.data[0] else: return tab.data[0][0] def get_cell_by_ids(self, row_id, col_id): """Returns the element at [row_id][col_id]""" try: row_index = self.row_names.index(row_id) except ValueError: raise KeyError(f"Row ({row_id}) not found in table") else: try: col_index = self.col_names.index(col_id) except ValueError: raise KeyError(f"Column ({col_id}) not found in table") return self.data[row_index][col_index] def get_row_by_id(self, row_id): """Returns the 'row_id' row""" try: row_index = self.row_names.index(row_id) except ValueError: raise KeyError(f"Row ({row_id}) not found in table") return self.data[row_index] def get_column_by_id(self, col_id, distinct=False): """Returns the 'col_id' col""" try: col_index = self.col_names.index(col_id) except ValueError: raise KeyError(f"Column ({col_id}) not found in table") return self.get_column(col_index, distinct) def get_columns(self) -> List[List[int]]: """Returns all the columns in the table""" return [self[:, index] for index in range(len(self.col_names))] def get_column(self, col_index, distinct=False): """get a column by index""" col = [row[col_index] for row in self.data] if distinct: col = list(set(col)) return col def apply_stylesheet(self, stylesheet: "TableStyleSheet") -> None: """Applies the stylesheet to this table""" for instruction in stylesheet.instructions: eval(instruction) def transpose(self) -> "Table": """Keeps the self object intact, and returns the transposed (rotated) table. 
""" transposed = Table() transposed.create_rows(self.col_names) transposed.create_columns(self.row_names) for col_index, column in enumerate(self.get_columns()): transposed.set_row(col_index, column) return transposed def pprint(self) -> str: """returns a string representing the table in a pretty printed 'text' format. """ # The maximum row name (to know the start_index of the first col) max_row_name = 0 for row_name in self.row_names: if len(row_name) > max_row_name: max_row_name = len(row_name) col_start = max_row_name + 5 lines = [] # Build the 'first' line <=> the col_names one # The first cell <=> an empty one col_names_line = [" " * col_start] for col_name in self.col_names: col_names_line.append(col_name + " " * 5) lines.append("|" + "|".join(col_names_line) + "|") max_line_length = len(lines[0]) # Build the table for row_index, row in enumerate(self.data): line = [] # First, build the row_name's cell row_name = self.row_names[row_index] line.append(row_name + " " * (col_start - len(row_name))) # Then, build all the table's cell for this line. for col_index, cell in enumerate(row): col_name_length = len(self.col_names[col_index]) + 5 data = str(cell) line.append(data + " " * (col_name_length - len(data))) lines.append("|" + "|".join(line) + "|") if len(lines[-1]) > max_line_length: max_line_length = len(lines[-1]) # Wrap the table with '-' to make a frame lines.insert(0, "-" * max_line_length) lines.append("-" * max_line_length) return "\n".join(lines) def __repr__(self) -> str: return repr(self.data) def as_text(self): data = [] # We must convert cells into strings before joining them for row in self.data: data.append([str(cell) for cell in row]) lines = ["\t".join(row) for row in data] return "\n".join(lines) class TableStyle: """Defines a table's style""" def __init__(self, table: Table) -> None: self._table = table self.size = dict([(col_name, "1*") for col_name in table.col_names]) # __row_column__ is a special key to define the first column which # actually has no name (<=> left most column <=> row names column) self.size["__row_column__"] = "1*" self.alignment = dict([(col_name, "right") for col_name in table.col_names]) self.alignment["__row_column__"] = "right" # We shouldn't have to create an entry for # the 1st col (the row_column one) self.units = dict([(col_name, "") for col_name in table.col_names]) self.units["__row_column__"] = "" # XXX FIXME : params order should be reversed for all set() methods def set_size(self, value: str, col_id: str) -> None: """sets the size of the specified col_id to value""" self.size[col_id] = value def set_size_by_index(self, value: str, col_index: int) -> None: """Allows to set the size according to the column index rather than using the column's id. BE CAREFUL : the '0' column is the '__row_column__' one ! """ if col_index == 0: col_id = "__row_column__" else: col_id = self._table.col_names[col_index - 1] self.size[col_id] = value def set_alignment(self, value: str, col_id: str) -> None: """sets the alignment of the specified col_id to value""" self.alignment[col_id] = value def set_alignment_by_index(self, value: str, col_index: int) -> None: """Allows to set the alignment according to the column index rather than using the column's id. BE CAREFUL : the '0' column is the '__row_column__' one ! 
""" if col_index == 0: col_id = "__row_column__" else: col_id = self._table.col_names[col_index - 1] self.alignment[col_id] = value def set_unit(self, value: str, col_id: str) -> None: """sets the unit of the specified col_id to value""" self.units[col_id] = value def set_unit_by_index(self, value: str, col_index: int) -> None: """Allows to set the unit according to the column index rather than using the column's id. BE CAREFUL : the '0' column is the '__row_column__' one ! (Note that in the 'unit' case, you shouldn't have to set a unit for the 1st column (the __row__column__ one)) """ if col_index == 0: col_id = "__row_column__" else: col_id = self._table.col_names[col_index - 1] self.units[col_id] = value def get_size(self, col_id: str) -> str: """Returns the size of the specified col_id""" return self.size[col_id] def get_size_by_index(self, col_index: int) -> str: """Allows to get the size according to the column index rather than using the column's id. BE CAREFUL : the '0' column is the '__row_column__' one ! """ if col_index == 0: col_id = "__row_column__" else: col_id = self._table.col_names[col_index - 1] return self.size[col_id] def get_alignment(self, col_id: str) -> str: """Returns the alignment of the specified col_id""" return self.alignment[col_id] def get_alignment_by_index(self, col_index: int) -> str: """Allors to get the alignment according to the column index rather than using the column's id. BE CAREFUL : the '0' column is the '__row_column__' one ! """ if col_index == 0: col_id = "__row_column__" else: col_id = self._table.col_names[col_index - 1] return self.alignment[col_id] def get_unit(self, col_id: str) -> str: """Returns the unit of the specified col_id""" return self.units[col_id] def get_unit_by_index(self, col_index: int) -> str: """Allors to get the unit according to the column index rather than using the column's id. BE CAREFUL : the '0' column is the '__row_column__' one ! """ if col_index == 0: col_id = "__row_column__" else: col_id = self._table.col_names[col_index - 1] return self.units[col_id] CELL_PROG = re.compile("([0-9]+)_([0-9]+)") class TableStyleSheet: """A simple Table stylesheet Rules are expressions where cells are defined by the row_index and col_index separated by an underscore ('_'). For example, suppose you want to say that the (2,5) cell must be the sum of its two preceding cells in the row, you would create the following rule :: 2_5 = 2_3 + 2_4 You can also use all the math.* operations you want. For example:: 2_5 = sqrt(2_3**2 + 2_4**2) """ def __init__(self, rules: Optional[List[str]] = None) -> None: rules = rules or [] self.rules: List[str] = [] self.instructions: List[CodeType] = [] for rule in rules: self.add_rule(rule) def add_rule(self, rule: str) -> None: """Adds a rule to the stylesheet rules""" try: source_code = ["from math import *"] source_code.append(CELL_PROG.sub(r"self.data[\1][\2]", rule)) self.instructions.append(compile("\n".join(source_code), "table.py", "exec")) self.rules.append(rule) except SyntaxError: print(f"Bad Stylesheet Rule : {rule} [skipped]") def add_rowsum_rule( self, dest_cell: Tuple[int, int], row_index: int, start_col: int, end_col: int ) -> None: """Creates and adds a rule to sum over the row at row_index from start_col to end_col. dest_cell is a tuple of two elements (x,y) of the destination cell No check is done for indexes ranges. 
pre:: start_col >= 0 end_col > start_col """ cell_list = ["%d_%d" % (row_index, index) for index in range(start_col, end_col + 1)] rule = "%d_%d=" % dest_cell + "+".join(cell_list) self.add_rule(rule) def add_rowavg_rule( self, dest_cell: Tuple[int, int], row_index: int, start_col: int, end_col: int ) -> None: """Creates and adds a rule to make the row average (from start_col to end_col) dest_cell is a tuple of two elements (x,y) of the destination cell No check is done for indexes ranges. pre:: start_col >= 0 end_col > start_col """ cell_list = ["%d_%d" % (row_index, index) for index in range(start_col, end_col + 1)] num = end_col - start_col + 1 rule = "%d_%d=" % dest_cell + "(" + "+".join(cell_list) + f")/{num:f}" self.add_rule(rule) def add_colsum_rule( self, dest_cell: Tuple[int, int], col_index: int, start_row: int, end_row: int ) -> None: """Creates and adds a rule to sum over the col at col_index from start_row to end_row. dest_cell is a tuple of two elements (x,y) of the destination cell No check is done for indexes ranges. pre:: start_row >= 0 end_row > start_row """ cell_list = ["%d_%d" % (index, col_index) for index in range(start_row, end_row + 1)] rule = "%d_%d=" % dest_cell + "+".join(cell_list) self.add_rule(rule) def add_colavg_rule( self, dest_cell: Tuple[int, int], col_index: int, start_row: int, end_row: int ) -> None: """Creates and adds a rule to make the col average (from start_row to end_row) dest_cell is a tuple of two elements (x,y) of the destination cell No check is done for indexes ranges. pre:: start_row >= 0 end_row > start_row """ cell_list = ["%d_%d" % (index, col_index) for index in range(start_row, end_row + 1)] num = end_row - start_row + 1 rule = "%d_%d=" % dest_cell + "(" + "+".join(cell_list) + f")/{num:f}" self.add_rule(rule) class TableCellRenderer: """Defines a simple text renderer""" def __init__(self, **properties: Any) -> None: """keywords should be properties with an associated boolean as value. For example : renderer = TableCellRenderer(units = True, alignment = False) An unspecified property will have a 'False' value by default. Possible properties are : alignment, unit """ self.properties = properties def render_cell( self, cell_coord: Tuple[int, int], table: Table, table_style: TableStyle ) -> Union[str, int]: """Renders the cell at 'cell_coord' in the table, using table_style""" row_index, col_index = cell_coord cell_value = table.data[row_index][col_index] final_content = self._make_cell_content(cell_value, table_style, col_index + 1) return self._render_cell_content(final_content, table_style, col_index + 1) def render_row_cell( self, row_name: str, table: Table, table_style: TableStyle ) -> Union[str, int]: """Renders the cell for 'row_id' row""" cell_value = row_name return self._render_cell_content(cell_value, table_style, 0) def render_col_cell( self, col_name: str, table: Table, table_style: TableStyle ) -> Union[str, int]: """Renders the cell for 'col_id' row""" cell_value = col_name col_index = table.col_names.index(col_name) return self._render_cell_content(cell_value, table_style, col_index + 1) def _render_cell_content( self, content: Union[str, int], table_style: TableStyle, col_index: int ) -> Union[str, int]: """Makes the appropriate rendering for this cell content. 
Rendering properties will be searched using the *table_style.get_xxx_by_index(col_index)' methods **This method should be overridden in the derived renderer classes.** """ return content def _make_cell_content( self, cell_content: int, table_style: TableStyle, col_index: int ) -> Union[int, str]: """Makes the cell content (adds decoration data, like units for example) """ final_content: Union[int, str] = cell_content if "skip_zero" in self.properties: replacement_char = self.properties["skip_zero"] else: replacement_char = 0 if replacement_char and final_content == 0: return replacement_char try: units_on = self.properties["units"] if units_on: final_content = self._add_unit(cell_content, table_style, col_index) except KeyError: pass return final_content def _add_unit(self, cell_content: int, table_style: TableStyle, col_index: int) -> str: """Adds unit to the cell_content if needed""" unit = table_style.get_unit_by_index(col_index) return str(cell_content) + " " + unit class DocbookRenderer(TableCellRenderer): """Defines how to render a cell for a docboook table""" def define_col_header(self, col_index: int, table_style: TableStyle) -> str: """Computes the colspec element according to the style""" size = table_style.get_size_by_index(col_index) return '\n' % (col_index, size) def _render_cell_content( self, cell_content: Union[int, str], table_style: TableStyle, col_index: int ) -> str: """Makes the appropriate rendering for this cell content. Rendering properties will be searched using the table_style.get_xxx_by_index(col_index)' methods. """ try: align_on = self.properties["alignment"] alignment = table_style.get_alignment_by_index(col_index) if align_on: return f"{cell_content}\n" except KeyError: # KeyError <=> Default alignment return f"{cell_content}\n" # XXX really? 
return "" class TableWriter: """A class to write tables""" def __init__( self, stream: StringIO, table: Table, style: Optional[Any], **properties: Any ) -> None: self._stream = stream self.style = style or TableStyle(table) self._table = table self.properties = properties self.renderer: Optional[DocbookRenderer] = None def set_style(self, style): """sets the table's associated style""" self.style = style def set_renderer(self, renderer: DocbookRenderer) -> None: """sets the way to render cell""" self.renderer = renderer def update_properties(self, **properties): """Updates writer's properties (for cell rendering)""" self.properties.update(properties) def write_table(self, title: str = "") -> None: """Writes the table""" raise NotImplementedError("write_table must be implemented !") class DocbookTableWriter(TableWriter): """Defines an implementation of TableWriter to write a table in Docbook""" def _write_headers(self) -> None: """Writes col headers""" assert self.renderer is not None # Define col_headers (colstpec elements) for col_index in range(len(self._table.col_names) + 1): self._stream.write(self.renderer.define_col_header(col_index, self.style)) self._stream.write("\n\n") # XXX FIXME : write an empty entry <=> the first (__row_column) column self._stream.write("\n") for col_name in self._table.col_names: self._stream.write(self.renderer.render_col_cell(col_name, self._table, self.style)) self._stream.write("\n\n") def _write_body(self) -> None: """Writes the table body""" assert self.renderer is not None self._stream.write("\n") for row_index, row in enumerate(self._table.data): self._stream.write("\n") row_name = self._table.row_names[row_index] # Write the first entry (row_name) self._stream.write(self.renderer.render_row_cell(row_name, self._table, self.style)) for col_index, cell in enumerate(row): self._stream.write( self.renderer.render_cell((row_index, col_index), self._table, self.style) ) self._stream.write("\n") self._stream.write("\n") def write_table(self, title: str = "") -> None: """Writes the table""" self._stream.write(f"\n{title}>\n") self._stream.write( '\n' % (len(self._table.col_names) + 1) ) self._write_headers() self._write_body() self._stream.write("\n
\n") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/tasksqueue.py0000666000000000000000000000606114762603732020717 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Prioritized tasks queue""" __docformat__ = "restructuredtext en" from typing import Iterator, List from bisect import insort_left import queue LOW = 0 MEDIUM = 10 HIGH = 100 PRIORITY = { "LOW": LOW, "MEDIUM": MEDIUM, "HIGH": HIGH, } REVERSE_PRIORITY = dict((values, key) for key, values in PRIORITY.items()) class PrioritizedTasksQueue(queue.Queue): def _init(self, maxsize: int) -> None: """Initialize the queue representation""" self.maxsize = maxsize # ordered list of task, from the lowest to the highest priority self.queue: List["Task"] = [] # type: ignore def _put(self, item: "Task") -> None: """Put a new item in the queue""" for i, task in enumerate(self.queue): # equivalent task if task == item: # if new task has a higher priority, remove the one already # queued so the new priority will be considered if task < item: item.merge(task) del self.queue[i] break # else keep it so current order is kept task.merge(item) return insort_left(self.queue, item) def _get(self) -> "Task": """Get an item from the queue""" return self.queue.pop() def __iter__(self) -> Iterator["Task"]: return iter(self.queue) def remove(self, tid: str) -> None: """remove a specific task from the queue""" # XXX acquire lock for i, task in enumerate(self): if task.id == tid: self.queue.pop(i) return raise ValueError(f"not task of id {tid} in queue") class Task: def __init__(self, tid: str, priority: int = LOW) -> None: # task id self.id = tid # task priority self.priority = priority def __repr__(self) -> str: return "" % (self.id, id(self)) def __lt__(self, other: "Task") -> bool: return self.priority < other.priority def __eq__(self, other: object) -> bool: return isinstance(other, type(self)) and self.id == other.id __hash__ = object.__hash__ def merge(self, other: "Task") -> None: pass ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/testlib.py0000666000000000000000000010662414762603732020201 0ustar00rootroot# copyright 2003-2012 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. 
# # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Run tests. This will find all modules whose name match a given prefix in the test directory, and run them. Various command line options provide additional facilities. Command line options: -v verbose -- run tests in verbose mode with output to stdout -q quiet -- don't print anything except if a test fails -t testdir -- directory where the tests will be found -x exclude -- add a test to exclude -p profile -- profiled execution -d dbc -- enable design-by-contract -m match -- only run test matching the tag pattern which follow If no non-option arguments are present, prefixes used are 'test', 'regrtest', 'smoketest' and 'unittest'. """ __docformat__ = "restructuredtext en" # modified copy of some functions from test/regrtest.py from PyXml # disable camel case warning # pylint: disable=C0103 import sys import os import os.path as osp import types import doctest import inspect import unittest import traceback import tempfile import warnings from inspect import isgeneratorfunction, isclass, FrameInfo from functools import wraps from itertools import dropwhile from contextlib import contextmanager from typing import Any, Iterator, Union, Optional, Callable, Dict, List, Tuple, Generator from mypy_extensions import NoReturn from logilab.common import textutils from logilab.common.debugger import Debugger, colorize_source from logilab.common.decorators import cached, classproperty __all__ = ["unittest_main", "find_tests", "nocoverage", "pause_trace"] DEFAULT_PREFIXES = ("test", "regrtest", "smoketest", "unittest", "func", "validation") # used by unittest to count the number of relevant levels in the traceback __unittest = 1 def in_tempdir(callable): """A decorator moving the enclosed function inside the tempfile.tempfdir""" @wraps(callable) def proxy(*args, **kargs): old_cwd = os.getcwd() os.chdir(tempfile.tempdir) try: return callable(*args, **kargs) finally: os.chdir(old_cwd) return proxy def find_tests(testdir, prefixes=DEFAULT_PREFIXES, suffix=".py", excludes=(), remove_suffix=True): """ Return a list of all applicable test modules. 
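# Illustrative sketch (invented directory and module names): collecting the
# candidate test modules of a directory with the find_tests() helper.
def _example_find_tests() -> None:
    names = find_tests("tests", prefixes=("test",), excludes=("test_broken",))
    # names is a sorted list such as ["test_table", ...] with ".py" stripped
    assert isinstance(names, list)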
""" tests = [] for name in os.listdir(testdir): if not suffix or name.endswith(suffix): for prefix in prefixes: if name.startswith(prefix): if remove_suffix and name.endswith(suffix): name = name[: -len(suffix)] if name not in excludes: tests.append(name) tests.sort() return tests # PostMortem Debug facilities ##### def start_interactive_mode(result): """starts an interactive shell so that the user can inspect errors""" debuggers = result.debuggers descrs = result.error_descrs + result.fail_descrs if len(debuggers) == 1: # don't ask for test name if there's only one failure debuggers[0].start() else: while True: testindex = 0 print("Choose a test to debug:") # order debuggers in the same way than errors were printed print("\n".join([f"\t{i} : {descr}" for i, (_, descr) in enumerate(descrs)])) print("Type 'exit' (or ^D) to quit") print() try: todebug = input("Enter a test name: ") if todebug.strip().lower() == "exit": print() break else: try: testindex = int(todebug) debugger = debuggers[descrs[testindex][0]] except (ValueError, IndexError): print(f"ERROR: invalid test number {todebug!r}") else: debugger.start() except (EOFError, KeyboardInterrupt): print() break # coverage pausing tools ##################################################### @contextmanager def replace_trace(trace: Optional[Callable] = None) -> Iterator: """A context manager that temporary replaces the trace function""" oldtrace = sys.gettrace() sys.settrace(trace) try: yield finally: # specific hack to work around a bug in pycoverage, see # https://bitbucket.org/ned/coveragepy/issue/123 if oldtrace is not None and not callable(oldtrace) and hasattr(oldtrace, "pytrace"): oldtrace = oldtrace.pytrace sys.settrace(oldtrace) pause_trace = replace_trace def nocoverage(func: Callable) -> Callable: """Function decorator that pauses tracing functions""" if hasattr(func, "uncovered"): return func # mypy: "Callable[..., Any]" has no attribute "uncovered" # dynamic attribute for magic func.uncovered = True # type: ignore def not_covered(*args: Any, **kwargs: Any) -> Any: with pause_trace(): return func(*args, **kwargs) # mypy: "Callable[[VarArg(Any), KwArg(Any)], NoReturn]" has no attribute "uncovered" # dynamic attribute for magic not_covered.uncovered = True # type: ignore return not_covered # test utils ################################################################## # Add deprecation warnings about new api used by module level fixtures in unittest2 # http://www.voidspace.org.uk/python/articles/unittest2.shtml#setupmodule-and-teardownmodule class _DebugResult(object): # simplify import statement among unittest flavors.. "Used by the TestSuite to hold previous class when running in debug." 
_previousTestClass = None _moduleSetUpFailed = False shouldStop = False # backward compatibility: TestSuite might be imported from lgc.testlib TestSuite = unittest.TestSuite class keywords(dict): """Keyword args (**kwargs) support for generative tests.""" class starargs(tuple): """Variable arguments (*args) for generative tests.""" def __new__(cls, *args): return tuple.__new__(cls, args) unittest_main = unittest.main class InnerTestSkipped(unittest.SkipTest): """raised when a test is skipped""" def parse_generative_args(params: Tuple[int, ...]) -> Tuple[Union[List[bool], List[int]], Dict]: args = [] varargs = () kwargs: Dict = {} flags = 0 # 2 <=> starargs, 4 <=> kwargs for param in params: if isinstance(param, starargs): varargs = param if flags: raise TypeError("found starargs after keywords !") flags |= 2 args += list(varargs) elif isinstance(param, keywords): kwargs = param if flags & 4: raise TypeError("got multiple keywords parameters") flags |= 4 elif flags & 2 or flags & 4: raise TypeError("found parameters after kwargs or args") else: args.append(param) return args, kwargs class InnerTest(tuple): def __new__(cls, name, *data): instance = tuple.__new__(cls, data) instance.name = name return instance class Tags(set): """A set of tag able validate an expression""" def __init__(self, *tags: str, **kwargs: Any) -> None: self.inherit = kwargs.pop("inherit", True) if kwargs: raise TypeError(f"{kwargs.keys()} are an invalid keyword argument for this function") if len(tags) == 1 and not isinstance(tags[0], str): tags = tags[0] super(Tags, self).__init__(tags) def __getitem__(self, key: str) -> bool: return key in self def match(self, exp: str) -> bool: # mypy: Argument 3 to "eval" has incompatible type "Tags"; # mypy: expected "Optional[Mapping[str, Any]]" # I'm really not sure here? return eval(exp, {}, self) # type: ignore # mypy: Argument 1 of "__or__" is incompatible with supertype "AbstractSet"; # mypy: supertype defines the argument type as "AbstractSet[_T]" # not sure how to fix this one def __or__(self, other: "Tags") -> "Tags": # type: ignore return Tags(*super(Tags, self).__or__(other)) # duplicate definition from unittest2 of the _deprecate decorator def _deprecate(original_func): def deprecated_func(*args, **kwargs): warnings.warn(f"Please use {original_func.__name__} instead.", DeprecationWarning, 2) return original_func(*args, **kwargs) return deprecated_func class TestCase(unittest.TestCase): """A unittest.TestCase extension with some additional methods.""" maxDiff = None tags = Tags() def __init__(self, methodName: str = "runTest") -> None: super(TestCase, self).__init__(methodName) self.__exc_info = sys.exc_info self.__testMethodName = self._testMethodName self._current_test_descr = None self._options_ = None @classproperty @cached def datadir(cls) -> str: # pylint: disable=E0213 """helper attribute holding the standard test's data directory NOTE: this is a logilab's standard """ mod = sys.modules[cls.__module__] return osp.join(osp.dirname(osp.abspath(mod.__file__)), "data") # cache it (use a class method to cache on class since TestCase is # instantiated for each test run) @classmethod def datapath(cls, *fname: str) -> str: """joins the object's datadir and `fname`""" return osp.join(cls.datadir, *fname) def set_description(self, descr): """sets the current test's description. 
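# Illustrative sketch (invented tag names): a Tags set evaluates boolean tag
# expressions; this is how the tags_pattern option is matched later on by
# SkipAwareTextTestRunner.does_match_tags().
def _example_tags() -> None:
    tags = Tags("db", "slow")
    assert tags.match("db and not windows")
    assert not tags.match("windows or mac")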
This can be useful for generative tests because it allows to specify a description per yield """ self._current_test_descr = descr # override default's unittest.py feature def shortDescription(self) -> Optional[Any]: """override default unittest shortDescription to handle correctly generative tests """ if self._current_test_descr is not None: return self._current_test_descr return super(TestCase, self).shortDescription() def quiet_run(self, result: Any, func: Callable, *args: Any, **kwargs: Any) -> bool: try: func(*args, **kwargs) except (KeyboardInterrupt, SystemExit): raise except unittest.SkipTest as e: if hasattr(result, "addSkip"): result.addSkip(self, str(e)) else: warnings.warn( "TestResult has no addSkip method, skips not reported", RuntimeWarning, 2 ) result.addSuccess(self) return False except Exception: result.addError(self, self.__exc_info()) return False return True def _get_test_method(self) -> Callable: """return the test method""" return getattr(self, self._testMethodName) def optval(self, option, default=None): """return the option value or default if the option is not define""" return getattr(self._options_, option, default) def __call__(self, result=None, runcondition=None, options=None): """rewrite TestCase.__call__ to support generative tests This is mostly a copy/paste from unittest.py (i.e same variable names, same logic, except for the generative tests part) """ if result is None: result = self.defaultTestResult() self._options_ = options # if result.cvg: # result.cvg.start() testMethod = self._get_test_method() if getattr(self.__class__, "__unittest_skip__", False) or getattr( testMethod, "__unittest_skip__", False ): # If the class or method was skipped. try: skip_why = getattr(self.__class__, "__unittest_skip_why__", "") or getattr( testMethod, "__unittest_skip_why__", "" ) if hasattr(result, "addSkip"): result.addSkip(self, skip_why) else: warnings.warn( "TestResult has no addSkip method, skips not reported", RuntimeWarning, 2 ) result.addSuccess(self) finally: result.stopTest(self) return if runcondition and not runcondition(testMethod): return # test is skipped result.startTest(self) try: if not self.quiet_run(result, self.setUp): return generative = isgeneratorfunction(testMethod) # generative tests if generative: self._proceed_generative(result, testMethod, runcondition) else: status = self._proceed(result, testMethod) success = status == 0 if not self.quiet_run(result, self.tearDown): return if not generative and success: result.addSuccess(self) finally: # if result.cvg: # result.cvg.stop() result.stopTest(self) def _proceed_generative( self, result: Any, testfunc: Callable, runcondition: Callable = None ) -> bool: # cancel startTest()'s increment result.testsRun -= 1 success = True try: for params in testfunc(): if runcondition and not runcondition(testfunc, skipgenerator=False): if not (isinstance(params, InnerTest) and runcondition(params)): continue if not isinstance(params, (tuple, list)): params = (params,) func = params[0] args, kwargs = parse_generative_args(params[1:]) # increment test counter manually result.testsRun += 1 status = self._proceed(result, func, args, kwargs) if status == 0: result.addSuccess(self) success = True else: success = False # XXX Don't stop anymore if an error occured # if status == 2: # result.shouldStop = True if result.shouldStop: # either on error or on exitfirst + error break except self.failureException: result.addFailure(self, self.__exc_info()) success = False except unittest.SkipTest as e: result.addSkip(self, e) 
except Exception: # if an error occurs between two yield result.addError(self, self.__exc_info()) success = False return success def _proceed( self, result: Any, testfunc: Callable, args: Union[List[bool], List[int], Tuple[()]] = (), kwargs: Optional[Dict] = None, ) -> int: """proceed the actual test returns 0 on success, 1 on failure, 2 on error Note: addSuccess can't be called here because we have to wait for tearDown to be successfully executed to declare the test as successful """ kwargs = kwargs or {} try: testfunc(*args, **kwargs) except self.failureException: result.addFailure(self, self.__exc_info()) return 1 except KeyboardInterrupt: raise except InnerTestSkipped as e: result.addSkip(self, e) return 1 except unittest.SkipTest as e: result.addSkip(self, e) return 0 except Exception: result.addError(self, self.__exc_info()) return 2 return 0 def innerSkip(self, msg: str = None) -> NoReturn: """mark a generative test as skipped for the reason""" msg = msg or "test was skipped" raise InnerTestSkipped(msg) if sys.version_info >= (3, 2): assertItemsEqual = unittest.TestCase.assertCountEqual else: assertCountEqual = unittest.TestCase.assertItemsEqual class SkippedSuite(unittest.TestSuite): def test(self): """just there to trigger test execution""" self.skipped_test("doctest module has no DocTestSuite class") class DocTestFinder(doctest.DocTestFinder): def __init__(self, *args, **kwargs): self.skipped = kwargs.pop("skipped", ()) doctest.DocTestFinder.__init__(self, *args, **kwargs) def _get_test(self, obj, name, module, globs, source_lines): """override default _get_test method to be able to skip tests according to skipped attribute's value """ if getattr(obj, "__name__", "") in self.skipped: return None return doctest.DocTestFinder._get_test(self, obj, name, module, globs, source_lines) class MockConnection: """fake DB-API 2.0 connexion AND cursor (i.e. cursor() return self)""" def __init__(self, results): self.received = [] self.states = [] self.results = results def cursor(self): """Mock cursor method""" return self def execute(self, query, args=None): """Mock execute method""" self.received.append((query, args)) def fetchone(self): """Mock fetchone method""" return self.results[0] def fetchall(self): """Mock fetchall method""" return self.results def commit(self): """Mock commiy method""" self.states.append(("commit", len(self.received))) def rollback(self): """Mock rollback method""" self.states.append(("rollback", len(self.received))) def close(self): """Mock close method""" # mypy error: Name 'Mock' is not defined # dynamic class created by this class def mock_object(**params: Any) -> "Mock": # type: ignore # noqa """creates an object using params to set attributes >>> option = mock_object(verbose=False, index=range(5)) >>> option.verbose False >>> option.index [0, 1, 2, 3, 4] """ return type("Mock", (), params)() def create_files(paths: List[str], chroot: str) -> None: """Creates directories and files found in . 
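# Illustrative sketch (invented test case): with the generative-test support
# implemented by _proceed_generative() and _proceed() above, a test method may
# yield (callable, *args) tuples that are run and counted as individual checks.
class _SquaresTC(TestCase):
    def test_squares(self):
        for number, expected in [(2, 4), (3, 9)]:
            yield self.assertEqual, number * number, expected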
:param paths: list of relative paths to files or directories :param chroot: the root directory in which paths will be created >>> from os.path import isdir, isfile >>> isdir('/tmp/a') False >>> create_files(['a/b/foo.py', 'a/b/c/', 'a/b/c/d/e.py'], '/tmp') >>> isdir('/tmp/a') True >>> isdir('/tmp/a/b/c') True >>> isfile('/tmp/a/b/c/d/e.py') True >>> isfile('/tmp/a/b/foo.py') True """ dirs, files = set(), set() for path in paths: path = osp.join(chroot, path) filename = osp.basename(path) # path is a directory path if filename == "": dirs.add(path) # path is a filename path else: dirs.add(osp.dirname(path)) files.add(path) for dirpath in dirs: if not osp.isdir(dirpath): os.makedirs(dirpath) for filepath in files: open(filepath, "w").close() class AttrObject: # XXX cf mock_object def __init__(self, **kwargs): self.__dict__.update(kwargs) def tag(*args: str, **kwargs: Any) -> Callable: """descriptor adding tag to a function""" def desc(func: Callable) -> Callable: assert not hasattr(func, "tags") # mypy: "Callable[..., Any]" has no attribute "tags" # dynamic magic attribute func.tags = Tags(*args, **kwargs) # type: ignore return func return desc def require_version(version: str) -> Callable: """Compare version of python interpreter to the given one. Skip the test if older. """ def check_require_version(f: Callable) -> Callable: version_elements = version.split(".") try: compare = tuple([int(v) for v in version_elements]) except ValueError: raise ValueError(f"{version} is not a correct version : should be X.Y[.Z].") current = sys.version_info[:3] if current < compare: def new_f(self, *args, **kwargs): self.skipTest( "Need at least %s version of python. Current version is %s." % (version, ".".join([str(element) for element in current])) ) new_f.__name__ = f.__name__ return new_f else: return f return check_require_version def require_module(module: str) -> Callable: """Check if the given module is loaded. 
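# Illustrative sketch (invented test case): combining the decorators above; the
# method is tagged for tag-pattern filtering and skipped on older interpreters.
class _TaggedTC(TestCase):
    @tag("db", "slow")
    @require_version("3.7")
    def test_heavy_query(self):
        self.assertTrue(True)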
Skip the test if not.""" def check_require_module(f: Callable) -> Callable: try: __import__(module) return f except ImportError: def new_f(self, *args, **kwargs): self.skipTest(f"{module} can not be imported.") new_f.__name__ = f.__name__ return new_f return check_require_module class SkipAwareTextTestRunner(unittest.TextTestRunner): def __init__( self, stream=sys.stderr, verbosity=1, exitfirst=False, pdbmode=False, cvg=None, test_pattern=None, skipped_patterns=(), colorize=False, batchmode=False, options=None, ): super(SkipAwareTextTestRunner, self).__init__(stream=stream, verbosity=verbosity) self.exitfirst = exitfirst self.pdbmode = pdbmode self.cvg = cvg self.test_pattern = test_pattern self.skipped_patterns = skipped_patterns self.colorize = colorize self.batchmode = batchmode self.options = options def does_match_tags(self, test: Callable) -> bool: if self.options is not None: tags_pattern = getattr(self.options, "tags_pattern", None) if tags_pattern is not None: tags = getattr(test, "tags", Tags()) if tags.inherit and isinstance(test, types.MethodType): tags = tags | getattr(test.__self__.__class__, "tags", Tags()) return tags.match(tags_pattern) return True # no pattern def _makeResult(self) -> "SkipAwareTestResult": return SkipAwareTestResult( self.stream, self.descriptions, self.verbosity, self.exitfirst, self.pdbmode, self.cvg, self.colorize, ) class SkipAwareTestResult(unittest.TextTestResult): def __init__( self, stream, descriptions: bool, verbosity: int, exitfirst: bool = False, pdbmode: bool = False, cvg: Optional[Any] = None, colorize: bool = False, ) -> None: super(SkipAwareTestResult, self).__init__(stream, descriptions, verbosity) self.skipped: List[Tuple[Any, Any]] = [] self.debuggers: List = [] self.fail_descrs: List = [] self.error_descrs: List = [] self.exitfirst = exitfirst self.pdbmode = pdbmode self.cvg = cvg self.colorize = colorize self.pdbclass = Debugger self.verbose = verbosity > 1 def descrs_for(self, flavour: str) -> List[Tuple[int, str]]: return getattr(self, f"{flavour.lower()}_descrs") def _create_pdb(self, test_descr: str, flavour: str) -> None: self.descrs_for(flavour).append((len(self.debuggers), test_descr)) if self.pdbmode: self.debuggers.append(self.pdbclass(sys.exc_info()[2])) def _iter_valid_frames(self, frames: List[FrameInfo]) -> Generator[FrameInfo, Any, None]: """only consider non-testlib frames when formatting traceback""" def invalid(fi): return osp.abspath(fi[1]) in (lgc_testlib, std_testlib) lgc_testlib = osp.abspath(__file__) std_testlib = osp.abspath(unittest.__file__) for frameinfo in dropwhile(invalid, frames): yield frameinfo def _exc_info_to_string(self, err, test): """Converts a sys.exc_info()-style tuple of values into a string. 
This method is overridden here because we want to colorize lines if --color is passed, and display local variables if --verbose is passed """ exctype, exc, tb = err output = ["Traceback (most recent call last)"] frames = inspect.getinnerframes(tb) colorize = self.colorize frames = enumerate(self._iter_valid_frames(frames)) for index, (frame, filename, lineno, funcname, ctx, ctxindex) in frames: filename = osp.abspath(filename) if ctx is None: # pyc files or C extensions for instance source = "" else: source = "".join(ctx) if colorize: filename = textutils.colorize_ansi(filename, "magenta") source = colorize_source(source) output.append(f' File "{filename}", line {lineno}, in {funcname}') output.append(f" {source.strip()}") if self.verbose: output.append(f"{dir(frame)!r} == {test.__module__!r}") output.append("") output.append(" " + " local variables ".center(66, "-")) for varname, value in sorted(frame.f_locals.items()): output.append(f" {varname}: {value!r}") if varname == "self": # special handy processing for self for varname, value in sorted(vars(value).items()): output.append(f" self.{varname}: {value!r}") output.append(" " + "-" * 66) output.append("") output.append("".join(traceback.format_exception_only(exctype, exc))) return "\n".join(output) def addError(self, test, err): """err -> (exc_type, exc, tcbk)""" exc_type, exc, _ = err if isinstance(exc, unittest.SkipTest): assert exc_type == unittest.SkipTest self.addSkip(test, exc) else: if self.exitfirst: self.shouldStop = True descr = self.getDescription(test) super(SkipAwareTestResult, self).addError(test, err) self._create_pdb(descr, "error") def addFailure(self, test, err): if self.exitfirst: self.shouldStop = True descr = self.getDescription(test) super(SkipAwareTestResult, self).addFailure(test, err) self._create_pdb(descr, "fail") def addSkip(self, test, reason): self.skipped.append((test, reason)) if self.showAll: self.stream.writeln("SKIPPED") elif self.dots: self.stream.write("S") def printErrors(self) -> None: super(SkipAwareTestResult, self).printErrors() self.printSkippedList() def printSkippedList(self) -> None: # format (test, err) compatible with unittest2 for test, err in self.skipped: descr = self.getDescription(test) self.stream.writeln(self.separator1) self.stream.writeln(f"{'SKIPPED'}: {descr}") self.stream.writeln(f"\t{err}") def printErrorList(self, flavour, errors): for (_, descr), (test, err) in zip(self.descrs_for(flavour), errors): self.stream.writeln(self.separator1) self.stream.writeln(f"{flavour}: {descr}") self.stream.writeln(self.separator2) self.stream.writeln(err) self.stream.writeln("no stdout".center(len(self.separator2))) self.stream.writeln("no stderr".center(len(self.separator2))) class NonStrictTestLoader(unittest.TestLoader): """ Overrides default testloader to be able to omit classname when specifying tests to run on command line. For example, if the file test_foo.py contains :: class FooTC(TestCase): def test_foo1(self): # ... def test_foo2(self): # ... def test_bar1(self): # ... class BarTC(TestCase): def test_bar2(self): # ... 
'python test_foo.py' will run the 3 tests in FooTC 'python test_foo.py FooTC' will run the 3 tests in FooTC 'python test_foo.py test_foo' will run test_foo1 and test_foo2 'python test_foo.py test_foo1' will run test_foo1 'python test_foo.py test_bar' will run FooTC.test_bar1 and BarTC.test_bar2 """ def __init__(self) -> None: self.skipped_patterns = () # some magic here to accept empty list by extending # and to provide callable capability def loadTestsFromNames(self, names: List[str], module: type = None) -> TestSuite: suites = [] for name in names: suites.extend(self.loadTestsFromName(name, module)) return self.suiteClass(suites) def _collect_tests(self, module: type) -> Dict[str, Tuple[type, List[str]]]: tests = {} for obj in vars(module).values(): if isclass(obj) and issubclass(obj, unittest.TestCase): classname = obj.__name__ if classname[0] == "_" or self._this_is_skipped(classname): continue methodnames = [] # obj is a TestCase class for attrname in dir(obj): if attrname.startswith(self.testMethodPrefix): attr = getattr(obj, attrname) if callable(attr): methodnames.append(attrname) # keep track of class (obj) for convenience tests[classname] = (obj, methodnames) return tests def loadTestsFromSuite(self, module, suitename): try: suite = getattr(module, suitename)() except AttributeError: return [] assert hasattr(suite, "_tests"), "%s.%s is not a valid TestSuite" % ( module.__name__, suitename, ) # python2.3 does not implement __iter__ on suites, we need to return # _tests explicitly return suite._tests def loadTestsFromName(self, name, module=None): parts = name.split(".") if module is None or len(parts) > 2: # let the base class do its job here return [super(NonStrictTestLoader, self).loadTestsFromName(name)] tests = self._collect_tests(module) collected = [] if len(parts) == 1: pattern = parts[0] if callable(getattr(module, pattern, None)) and pattern not in tests: # consider it as a suite return self.loadTestsFromSuite(module, pattern) if pattern in tests: # case python unittest_foo.py MyTestTC klass, methodnames = tests[pattern] for methodname in methodnames: collected = [klass(methodname) for methodname in methodnames] else: # case python unittest_foo.py something for klass, methodnames in tests.values(): # skip methodname if matched by skipped_patterns for skip_pattern in self.skipped_patterns: methodnames = [ methodname for methodname in methodnames if skip_pattern not in methodname ] collected += [ klass(methodname) for methodname in methodnames if pattern in methodname ] elif len(parts) == 2: # case "MyClass.test_1" classname, pattern = parts klass, methodnames = tests.get(classname, (None, [])) for methodname in methodnames: collected = [ klass(methodname) for methodname in methodnames if pattern in methodname ] return collected def _this_is_skipped(self, testedname: str) -> bool: # mypy: Need type annotation for 'pat' # doc doesn't say how to that in list comprehension return any([(pat in testedname) for pat in self.skipped_patterns]) # type: ignore def getTestCaseNames(self, testCaseClass: type) -> List[str]: """Return a sorted sequence of method names found within testCaseClass""" is_skipped = self._this_is_skipped classname = testCaseClass.__name__ if classname[0] == "_" or is_skipped(classname): return [] testnames = super(NonStrictTestLoader, self).getTestCaseNames(testCaseClass) return [testname for testname in testnames if not is_skipped(testname)] # The 2 functions below are modified versions of the TestSuite.run method # that is provided with unittest2 for python 
2.6, in unittest2/suite.py # It is used to monkeypatch the original implementation to support # extra runcondition and options arguments (see in testlib.py) def _ts_wrapped_run( self: Any, result: SkipAwareTestResult, debug: bool = False, runcondition: Callable = None, options: Optional[Any] = None, ) -> SkipAwareTestResult: for test in self: if result.shouldStop: break if unittest.suite._isnotsuite(test): self._tearDownPreviousClass(test, result) self._handleModuleFixture(test, result) self._handleClassSetUp(test, result) result._previousTestClass = test.__class__ if getattr(test.__class__, "_classSetupFailed", False) or getattr( result, "_moduleSetUpFailed", False ): continue # --- modifications to deal with _wrapped_run --- # original code is: # # if not debug: # test(result) # else: # test.debug() if hasattr(test, "_wrapped_run"): try: test._wrapped_run(result, debug, runcondition=runcondition, options=options) except TypeError: test._wrapped_run(result, debug) elif not debug: try: test(result, runcondition, options) except TypeError: test(result) else: test.debug() # --- end of modifications to deal with _wrapped_run --- return result def _ts_run( # noqa self: Any, result: SkipAwareTestResult, debug: bool = False, runcondition: Callable = None, options: Optional[Any] = None, ) -> SkipAwareTestResult: topLevel = False if getattr(result, "_testRunEntered", False) is False: result._testRunEntered = topLevel = True self._wrapped_run(result, debug, runcondition, options) if topLevel: self._tearDownPreviousClass(None, result) self._handleModuleTearDown(result) result._testRunEntered = False return result # monkeypatch unittest and doctest (ouch !) unittest.TextTestResult = SkipAwareTestResult unittest.TextTestRunner = SkipAwareTextTestRunner unittest.TestLoader = NonStrictTestLoader unittest.FunctionTestCase.__bases__ = (TestCase,) unittest.TestSuite.run = _ts_run unittest.TestSuite._wrapped_run = _ts_wrapped_run ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/textutils.py0000666000000000000000000004343014762603732020573 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Some text manipulation utility functions. 
:group text formatting: normalize_text, normalize_paragraph, pretty_match,\ unquote, colorize_ansi :group text manipulation: searchall, splitstrip :sort: text formatting, text manipulation :type ANSI_STYLES: dict(str) :var ANSI_STYLES: dictionary mapping style identifier to ANSI terminal code :type ANSI_COLORS: dict(str) :var ANSI_COLORS: dictionary mapping color identifier to ANSI terminal code :type ANSI_PREFIX: str :var ANSI_PREFIX: ANSI terminal code notifying the start of an ANSI escape sequence :type ANSI_END: str :var ANSI_END: ANSI terminal code notifying the end of an ANSI escape sequence :type ANSI_RESET: str :var ANSI_RESET: ANSI terminal code resetting format defined by a previous ANSI escape sequence """ __docformat__ = "restructuredtext en" import sys import re import os.path as osp try: from re import Pattern, Match # type: ignore except ImportError: # Pattern and Match are python > 3.6 only. # # To be compatible with python <= 3.6, and still provide some typing, we # manually define Pattern and Match, in the same manner they are defined in # the re module of python > 3.7 # cf https://github.com/python/cpython/blob/3.7/Lib/re.py#L264 Pattern = type(re.sre_compile.compile("", 0)) # type: ignore Match = type(re.sre_compile.compile("", 0).match("")) # type: ignore from unicodedata import normalize as _uninormalize from typing import Optional, Tuple, List, Callable, Dict, Union try: from os import linesep except ImportError: linesep = "\n" # gae MANUAL_UNICODE_MAP = { "\xa1": "!", # INVERTED EXCLAMATION MARK "\u0142": "l", # LATIN SMALL LETTER L WITH STROKE "\u2044": "/", # FRACTION SLASH "\xc6": "AE", # LATIN CAPITAL LETTER AE "\xa9": "(c)", # COPYRIGHT SIGN "\xab": '"', # LEFT-POINTING DOUBLE ANGLE QUOTATION MARK "\xe6": "ae", # LATIN SMALL LETTER AE "\xae": "(r)", # REGISTERED SIGN "\u0153": "oe", # LATIN SMALL LIGATURE OE "\u0152": "OE", # LATIN CAPITAL LIGATURE OE "\xd8": "O", # LATIN CAPITAL LETTER O WITH STROKE "\xf8": "o", # LATIN SMALL LETTER O WITH STROKE "\xbb": '"', # RIGHT-POINTING DOUBLE ANGLE QUOTATION MARK "\xdf": "ss", # LATIN SMALL LETTER SHARP S "\u2013": "-", # HYPHEN "\u2019": "'", # SIMPLE QUOTE } def unormalize(ustring: str, substitute: Optional[str] = None) -> str: """replace diacritical characters with their corresponding ascii characters Convert the unicode string to its long normalized form (unicode character will be transform into several characters) and keep the first one only. The normal form KD (NFKD) will apply the compatibility decomposition, i.e. replace all compatibility characters with their equivalents. 
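# Illustrative sketch: stripping diacritics with unormalize(); characters with
# no ASCII decomposition are replaced by `substitute` (a ValueError is raised
# when no substitute is given).
def _example_unormalize() -> None:
    assert unormalize("déjà vu") == "deja vu"
    assert unormalize("10 €", substitute="?") == "10 ?"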
:type substitute: str :param substitute: replacement character to use if decomposition fails :see: Another project about ASCII transliterations of Unicode text http://pypi.python.org/pypi/Unidecode """ res = [] for letter in ustring[:]: try: replacement = MANUAL_UNICODE_MAP[letter] except KeyError: replacement = _uninormalize("NFKD", letter)[0] if ord(replacement) >= 2**7: if substitute is None: raise ValueError("can't deal with non-ascii based characters") replacement = substitute res.append(replacement) return "".join(res) def unquote(string: str) -> str: """remove optional quotes (simple or double) from the string :type string: str or unicode :param string: an optionally quoted string :rtype: str or unicode :return: the unquoted string (or the input string if it wasn't quoted) """ if not string: return string if string[0] in "\"'": string = string[1:] if string[-1] in "\"'": string = string[:-1] return string _BLANKLINES_RGX = re.compile("\r?\n\r?\n") _NORM_SPACES_RGX = re.compile(r"\s+") def normalize_text(text: str, line_len: int = 80, indent: str = "", rest: bool = False) -> str: """normalize a text to display it with a maximum line size and optionally arbitrary indentation. Line jumps are normalized but blank lines are kept. The indentation string may be used to insert a comment (#) or a quoting (>) mark for instance. :type text: str or unicode :param text: the input text to normalize :type line_len: int :param line_len: expected maximum line's length, default to 80 :type indent: str or unicode :param indent: optional string to use as indentation :rtype: str or unicode :return: the input text normalized to fit on lines with a maximized size inferior to `line_len`, and optionally prefixed by an indentation string """ if rest: normp = normalize_rest_paragraph else: normp = normalize_paragraph result = [] for text in _BLANKLINES_RGX.split(text): result.append(normp(text, line_len, indent)) return (f"{linesep}{indent}{linesep}").join(result) def normalize_paragraph(text: str, line_len: int = 80, indent: str = "") -> str: """normalize a text to display it with a maximum line size and optionally arbitrary indentation. Line jumps are normalized. The indentation string may be used top insert a comment mark for instance. :type text: str or unicode :param text: the input text to normalize :type line_len: int :param line_len: expected maximum line's length, default to 80 :type indent: str or unicode :param indent: optional string to use as indentation :rtype: str or unicode :return: the input text normalized to fit on lines with a maximized size inferior to `line_len`, and optionally prefixed by an indentation string """ text = _NORM_SPACES_RGX.sub(" ", text) line_len = line_len - len(indent) lines = [] while text: aline, text = splittext(text.strip(), line_len) lines.append(indent + aline) return linesep.join(lines) def normalize_rest_paragraph(text: str, line_len: int = 80, indent: str = "") -> str: """normalize a ReST text to display it with a maximum line size and optionally arbitrary indentation. Line jumps are normalized. The indentation string may be used top insert a comment mark for instance. 
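# Illustrative sketch: re-wrapping a paragraph at 30 columns and prefixing each
# line with a comment marker; the exact wrapping depends on the platform linesep.
def _example_normalize_text() -> None:
    wrapped = normalize_text(
        "a fairly long sentence that will not fit on one line", line_len=30, indent="# "
    )
    assert all(line.startswith("# ") for line in wrapped.splitlines())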
:type text: str or unicode :param text: the input text to normalize :type line_len: int :param line_len: expected maximum line's length, default to 80 :type indent: str or unicode :param indent: optional string to use as indentation :rtype: str or unicode :return: the input text normalized to fit on lines with a maximized size inferior to `line_len`, and optionally prefixed by an indentation string """ toreport = "" lines = [] line_len = line_len - len(indent) for line in text.splitlines(): line = toreport + _NORM_SPACES_RGX.sub(" ", line.strip()) toreport = "" while len(line) > line_len: # too long line, need split line, toreport = splittext(line, line_len) lines.append(indent + line) if toreport: line = toreport + " " toreport = "" else: line = "" if line: lines.append(indent + line.strip()) return linesep.join(lines) def splittext(text: str, line_len: int) -> Tuple[str, str]: """split the given text on space according to the given max line size return a 2-uple: * a line <= line_len if possible * the rest of the text which has to be reported on another line """ if len(text) <= line_len: return text, "" pos = min(len(text) - 1, line_len) while pos > 0 and text[pos] != " ": pos -= 1 if pos == 0: pos = min(len(text), line_len) while len(text) > pos and text[pos] != " ": pos += 1 return text[:pos], text[pos + 1 :].strip() def splitstrip(string: str, sep: str = ",") -> List[str]: """return a list of stripped string by splitting the string given as argument on `sep` (',' by default). Empty string are discarded. >>> splitstrip('a, b, c , 4,,') ['a', 'b', 'c', '4'] >>> splitstrip('a') ['a'] >>> :type string: str or unicode :param string: a csv line :type sep: str or unicode :param sep: field separator, default to the comma (',') :rtype: str or unicode :return: the unquoted string (or the input string if it wasn't quoted) """ return [word.strip() for word in string.split(sep) if word.strip()] def split_url_or_path(url_or_path): """return the latest component of a string containing either an url of the form :// or a local file system path """ if "://" in url_or_path: return url_or_path.rstrip("/").rsplit("/", 1) return osp.split(url_or_path.rstrip(osp.sep)) def text_to_dict(text): """parse multilines text containing simple 'key=value' lines and return a dict of {'key': 'value'}. When the same key is encountered multiple time, value is turned into a list containing all values. >>> d = text_to_dict('''multiple=1 ... multiple= 2 ... single =3 ... ''') >>> d['single'] '3' >>> d['multiple'] ['1', '2'] """ res = {} if not text: return res for line in text.splitlines(): line = line.strip() if line and not line.startswith("#"): key, value = [w.strip() for w in line.split("=", 1)] if key in res: try: res[key].append(value) except AttributeError: res[key] = [res[key], value] else: res[key] = value return res _BLANK_URE = r"(\s|,)+" _BLANK_RE = re.compile(_BLANK_URE) __VALUE_URE = r"-?(([0-9]+\.[0-9]*)|((0x?)?[0-9]+))" __UNITS_URE = r"[a-zA-Z]+" _VALUE_RE = re.compile(r"(?P%s)(?P%s)?" 
% (__VALUE_URE, __UNITS_URE)) _VALIDATION_RE = re.compile(r"^((%s)(%s))*(%s)?$" % (__VALUE_URE, __UNITS_URE, __VALUE_URE)) BYTE_UNITS = { "b": 1, "kb": 1024, "mb": 1024**2, "gb": 1024**3, "tb": 1024**4, } TIME_UNITS = { "ms": 0.0001, "s": 1, "min": 60, "h": 60 * 60, "d": 60 * 60 * 24, } def apply_units( string: str, units: Dict[str, int], inter: Union[Callable, None, type] = None, final: type = float, blank_reg: Pattern = _BLANK_RE, value_reg: Pattern = _VALUE_RE, ) -> Union[float, int]: """Parse the string applying the units defined in units (e.g.: "1.5m",{'m',60} -> 80). :type string: str or unicode :param string: the string to parse :type units: dict (or any object with __getitem__ using basestring key) :param units: a dict mapping a unit string repr to its value :type inter: type :param inter: used to parse every intermediate value (need __sum__) :type blank_reg: regexp :param blank_reg: should match every blank char to ignore. :type value_reg: regexp with "value" and optional "unit" group :param value_reg: match a value and it's unit into the """ if inter is None: inter = final fstring = _BLANK_RE.sub("", string) if not (fstring and _VALIDATION_RE.match(fstring)): raise ValueError(f"Invalid unit string: {string!r}.") values = [] for match in value_reg.finditer(fstring): dic = match.groupdict() lit, unit = dic["value"], dic.get("unit") value = inter(lit) if unit is not None: try: value *= units[unit.lower()] except KeyError: raise ValueError(f"invalid unit {unit}. valid units are {list(units.keys())}") values.append(value) return final(sum(values)) _LINE_RGX = re.compile("\r\n|\r+|\n") def pretty_match(match: Match, string: str, underline_char: str = "^") -> str: """return a string with the match location underlined: >>> import re >>> print(pretty_match(re.search('mange', 'il mange du bacon'), 'il mange du bacon')) il mange du bacon ^^^^^ >>> :type match: _sre.SRE_match :param match: object returned by re.match, re.search or re.finditer :type string: str or unicode :param string: the string on which the regular expression has been applied to obtain the `match` object :type underline_char: str or unicode :param underline_char: character to use to underline the matched section, default to the carret '^' :rtype: str or unicode :return: the original string with an inserted line to underline the match location """ start = match.start() end = match.end() string = _LINE_RGX.sub(linesep, string) start_line_pos = string.rfind(linesep, 0, start) if start_line_pos == -1: start_line_pos = 0 result = [] else: result = [string[:start_line_pos]] start_line_pos += len(linesep) offset = start - start_line_pos underline = " " * offset + underline_char * (end - start) end_line_pos = string.find(linesep, end) if end_line_pos == -1: string = string[start_line_pos:] result.append(string) result.append(underline) else: # mypy: Incompatible types in assignment (expression has type "str", # mypy: variable has type "int") # but it's a str :| end = string[end_line_pos + len(linesep) :] # type: ignore string = string[start_line_pos:end_line_pos] result.append(string) result.append(underline) result.append(end) # type: ignore # see previous comment return linesep.join(result).rstrip() # Ansi colorization ########################################################### ANSI_PREFIX = "\033[" ANSI_END = "m" ANSI_RESET = "\033[0m" ANSI_STYLES = { "reset": "0", "bold": "1", "italic": "3", "underline": "4", "blink": "5", "inverse": "7", "strike": "9", } ANSI_COLORS = { "reset": "0", "black": "30", "red": "31", "green": "32", 
"yellow": "33", "blue": "34", "magenta": "35", "cyan": "36", "white": "37", "background_black": "40", "background_red": "41", "background_green": "42", "background_yellow": "43", "background_blue": "44", "background_magenta": "45", "background_cyan": "46", "background_white": "47", } def _get_ansi_code(color: Optional[str] = None, style: Optional[str] = None) -> str: """return ansi escape code corresponding to color and style :type color: str or None :param color: the color name (see `ANSI_COLORS` for available values) or the color number when 256 colors are available :type style: str or None :param style: style string (see `ANSI_COLORS` for available values). To get several style effects at the same time, use a coma as separator. :raise KeyError: if an unexistent color or style identifier is given :rtype: str :return: the built escape code """ ansi_code = [] if style: style_attrs = splitstrip(style) for effect in style_attrs: ansi_code.append(ANSI_STYLES[effect]) if color: if color.isdigit(): ansi_code.extend(["38", "5"]) ansi_code.append(color) else: ansi_code.append(ANSI_COLORS[color]) if ansi_code: return ANSI_PREFIX + ";".join(ansi_code) + ANSI_END return "" def colorize_ansi(msg: str, color: Optional[str] = None, style: Optional[str] = None) -> str: """colorize message by wrapping it with ansi escape codes :type msg: str or unicode :param msg: the message string to colorize :type color: str or None :param color: the color identifier (see `ANSI_COLORS` for available values) :type style: str or None :param style: style string (see `ANSI_COLORS` for available values). To get several style effects at the same time, use a coma as separator. :raise KeyError: if an unexistent color or style identifier is given :rtype: str or unicode :return: the ansi escaped string """ # If both color and style are not defined, then leave the text as is if color is None and style is None: return msg escape_code = _get_ansi_code(color, style) # If invalid (or unknown) color, don't wrap msg with ansi codes if escape_code: return f"{escape_code}{msg}{ANSI_RESET}" return msg DIFF_STYLE = {"separator": "cyan", "remove": "red", "add": "green"} def diff_colorize_ansi(lines, out=sys.stdout, style=DIFF_STYLE): for line in lines: if line[:4] in ("--- ", "+++ "): out.write(colorize_ansi(line, style["separator"])) elif line[0] == "-": out.write(colorize_ansi(line, style["remove"])) elif line[0] == "+": out.write(colorize_ansi(line, style["add"])) elif line[:4] == "--- ": out.write(colorize_ansi(line, style["separator"])) elif line[:4] == "+++ ": out.write(colorize_ansi(line, style["separator"])) else: out.write(line) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/tree.py0000666000000000000000000002657714762603732017502 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. 
# # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Base class to represent a tree structure.""" __docformat__ = "restructuredtext en" import sys from logilab.common.visitor import VisitedMixIn, FilteredIterator, no_filter from typing import Optional, Any, List, Callable # Exceptions ################################################################# class NodeNotFound(Exception): """raised when a node has not been found""" EX_SIBLING_NOT_FOUND: str = "No such sibling as '%s'" EX_CHILD_NOT_FOUND: str = "No such child as '%s'" EX_NODE_NOT_FOUND: str = "No such node as '%s'" # Base node ################################################################### # describe node of current class NodeType = Any class Node: """a basic tree node, characterized by an id""" def __init__(self, nid: Optional[str] = None) -> None: self.id = nid # navigation # should be something like Optional[type(self)] for subclasses but that's not possible? self.parent: Optional[NodeType] = None # should be something like List[type(self)] for subclasses but that's not possible? self.children: List[NodeType] = [] def __iter__(self): return iter(self.children) def __str__(self, indent=0): s = [f"{' ' * indent}{self.__class__.__name__} {self.id}"] indent += 2 for child in self.children: try: s.append(child.__str__(indent)) except TypeError: s.append(child.__str__()) return "\n".join(s) def is_leaf(self): return not self.children def append(self, child: NodeType) -> None: # should be child: type(self) but that's not possible """add a node to children""" self.children.append(child) child.parent = self def remove(self, child: NodeType) -> None: # should be child: type(self) but that's not possible """remove a child node""" self.children.remove(child) child.parent = None def insert(self, index: int, child: NodeType) -> None: # should be child: type(self) but that's not possible """insert a child node""" self.children.insert(index, child) child.parent = self def replace(self, old_child: NodeType, new_child: NodeType) -> None: """replace a child node with another""" i = self.children.index(old_child) self.children.pop(i) self.children.insert(i, new_child) new_child.parent = self def get_sibling(self, nid: str) -> NodeType: """return the sibling node that has given id""" try: assert self.parent is not None return self.parent.get_child_by_id(nid) except NodeNotFound: raise NodeNotFound(EX_SIBLING_NOT_FOUND % nid) def next_sibling(self): """ return the next sibling for this node if any """ parent = self.parent if parent is None: # root node has no sibling return None index = parent.children.index(self) try: return parent.children[index + 1] except IndexError: return None def previous_sibling(self): """ return the previous sibling for this node if any """ parent = self.parent if parent is None: # root node has no sibling return None index = parent.children.index(self) if index > 0: return parent.children[index - 1] return None def get_node_by_id(self, nid: str) -> NodeType: """ return node in whole hierarchy that has given id """ root = self.root() try: return root.get_child_by_id(nid, 1) except NodeNotFound: raise NodeNotFound(EX_NODE_NOT_FOUND % nid) def get_child_by_id(self, nid: str, recurse: Optional[bool] = None) -> NodeType: """ return child of given id """ if self.id == nid: return self for c in self.children: if recurse: try: return c.get_child_by_id(nid, 1) except NodeNotFound: continue if c.id == nid: return c raise NodeNotFound(EX_CHILD_NOT_FOUND % nid) 
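    # Illustrative sketch (hypothetical node ids), showing how the navigation
    # helpers above are typically combined:
    #
    #   root = Node("root")
    #   child = Node("child")
    #   root.append(child)
    #   assert child.parent is root
    #   assert root.get_child_by_id("child") is child
    #   assert child.get_node_by_id("root") is root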
def get_child_by_path(self, path: List[str]) -> NodeType: """ return child of given path (path is a list of ids) """ if len(path) > 0 and path[0] == self.id: if len(path) == 1: return self else: for c in self.children: try: return c.get_child_by_path(path[1:]) except NodeNotFound: pass raise NodeNotFound(EX_CHILD_NOT_FOUND % path) def depth(self) -> int: """ return depth of this node in the tree """ if self.parent is not None: return 1 + self.parent.depth() else: return 0 def depth_down(self) -> int: """ return depth of the tree from this node """ if self.children: return 1 + max([c.depth_down() for c in self.children]) return 1 def width(self) -> int: """ return the width of the tree from this node """ return len(self.leaves()) def root(self) -> NodeType: """ return the root node of the tree """ if self.parent is not None: return self.parent.root() return self def leaves(self) -> List[NodeType]: """ return a list with all the leaves nodes descendant from this node """ leaves = [] if self.children: for child in self.children: leaves += child.leaves() return leaves else: return [self] def flatten(self, _list: Optional[List[NodeType]] = None) -> List[NodeType]: """ return a list with all the nodes descendant from this node """ if _list is None: _list = [] _list.append(self) for c in self.children: c.flatten(_list) return _list def lineage(self) -> List[NodeType]: """ return list of parents up to root node """ lst = [self] if self.parent is not None: lst.extend(self.parent.lineage()) return lst class VNode(Node, VisitedMixIn): # we should probably merge this VisitedMixIn here because it's only used here """a visitable node""" class BinaryNode(VNode): """a binary node (i.e. only two children""" def __init__(self, lhs=None, rhs=None): VNode.__init__(self) if lhs is not None or rhs is not None: assert lhs and rhs self.append(lhs) self.append(rhs) def remove(self, child): """remove the child and replace this node with the other child""" self.children.remove(child) self.parent.replace(self, self.children[0]) def get_parts(self): """ return the left hand side and the right hand side of this node """ return self.children[0], self.children[1] if sys.version_info[0:2] >= (2, 2): list_class = list else: from UserList import UserList list_class = UserList class ListNode(VNode, list_class): """Used to manipulate Nodes as Lists""" def __init__(self): list_class.__init__(self) VNode.__init__(self) self.children = self def __str__(self, indent=0): return "%s%s %s" % ( indent * " ", self.__class__.__name__, ", ".join([str(v) for v in self]), ) def append(self, child): """add a node to children""" list_class.append(self, child) child.parent = self def insert(self, index, child): """add a node to children""" list_class.insert(self, index, child) child.parent = self def remove(self, child): """add a node to children""" list_class.remove(self, child) child.parent = None def pop(self, index): """add a node to children""" child = list_class.pop(self, index) child.parent = None def __iter__(self): return list_class.__iter__(self) # construct list from tree #################################################### def post_order_list(node: Optional[Node], filter_func: Callable = no_filter) -> List[Node]: """ create a list with tree nodes for which the function returned true in a post order fashion """ l, stack = [], [] poped, index = 0, 0 while node: if filter_func(node): if node.children and not poped: stack.append((node, index)) index = 0 node = node.children[0] else: l.append(node) index += 1 try: node = 
stack[-1][0].children[index] except IndexError: node = None else: node = None poped = 0 if node is None and stack: node, index = stack.pop() poped = 1 return l def pre_order_list(node: Optional[Node], filter_func: Callable = no_filter) -> List[Node]: """ create a list with tree nodes for which the function returned true in a pre order fashion """ l, stack = [], [] poped, index = 0, 0 while node: if filter_func(node): if not poped: l.append(node) if node.children and not poped: stack.append((node, index)) index = 0 node = node.children[0] else: index += 1 try: node = stack[-1][0].children[index] except IndexError: node = None else: node = None poped = 0 if node is None and len(stack) > 1: node, index = stack.pop() poped = 1 return l class PostfixedDepthFirstIterator(FilteredIterator): """a postfixed depth first iterator, designed to be used with visitors""" def __init__(self, node: Node, filter_func: Optional[Any] = None) -> None: FilteredIterator.__init__(self, node, post_order_list, filter_func) class PrefixedDepthFirstIterator(FilteredIterator): """a prefixed depth first iterator, designed to be used with visitors""" def __init__(self, node: Node, filter_func: Optional[Any] = None) -> None: FilteredIterator.__init__(self, node, pre_order_list, filter_func) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/types.py0000666000000000000000000000325614762603732017674 0ustar00rootroot# copyright 2019 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of yams. # # yams is free software: you can redistribute it and/or modify it under the # terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) # any later version. # # yams is distributed in the hope that it will be useful, but WITHOUT ANY # WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR # A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License # along with yams. If not, see . """Types declarations for types annotations""" from typing import TYPE_CHECKING, TypeVar # to avoid circular imports if TYPE_CHECKING: from logilab.common.tree import Node from logilab.common.ureports.html_writer import HTMLWriter from logilab.common.ureports.text_writer import TextWriter from logilab.common.ureports.nodes import Paragraph from logilab.common.ureports.nodes import Title from logilab.common.table import Table from logilab.common.optik_ext import OptionParser from logilab.common.optik_ext import Option from logilab.common import attrdict else: Node = TypeVar("Node") HTMLWriter = TypeVar("HTMLWriter") TextWriter = TypeVar("TextWriter") Table = TypeVar("Table") OptionParser = TypeVar("OptionParser") Option = TypeVar("Option") attrdict = TypeVar("attrdict") Paragraph = TypeVar("Paragraph") Title = TypeVar("Title") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/umessage.py0000666000000000000000000001475614762603732020350 0ustar00rootroot# copyright 2003-2012 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. 
# # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Unicode email support (extends email from stdlib)""" __docformat__ = "restructuredtext en" import email from encodings import search_function import sys from email.utils import parseaddr, parsedate from email.header import decode_header from email.message import Message from datetime import datetime from typing import Any, Optional, List, Tuple, Union try: from mx.DateTime import DateTime except ImportError: DateTime = datetime import logilab.common as lgc def decode_QP(string: str) -> str: parts: List[str] = [] for maybe_decoded, charset in decode_header(string): if not charset: charset = "iso-8859-15" # python 3 sometimes returns str and sometimes bytes. # the 'official' fix is to use the new 'policy' APIs # https://bugs.python.org/issue24797 # let's just handle this bug ourselves for now if isinstance(maybe_decoded, bytes): decoded = maybe_decoded.decode(charset, "replace") else: decoded = maybe_decoded assert isinstance(decoded, str) parts.append(decoded) if sys.version_info < (3, 3): # decoding was non-RFC compliant wrt to whitespace handling # see http://bugs.python.org/issue1079 return " ".join(parts) return "".join(parts) def message_from_file(fd): try: return UMessage(email.message_from_file(fd)) except email.errors.MessageParseError: return "" def message_from_string(string: str) -> Union["UMessage", str]: try: return UMessage(email.message_from_string(string)) except email.errors.MessageParseError: return "" class UMessage: """Encapsulates an email.Message instance and returns only unicode objects.""" def __init__(self, message: Message) -> None: self.message = message # email.Message interface ################################################# def get(self, header: str, default: Optional[Any] = None) -> Optional[str]: value = self.message.get(header, default) if value: return decode_QP(value) return value def __getitem__(self, header): return self.get(header) def get_all(self, header: str, default: Tuple[()] = ()) -> List[str]: return [decode_QP(val) for val in self.message.get_all(header, default) if val is not None] def is_multipart(self): return self.message.is_multipart() def get_boundary(self): return self.message.get_boundary() def walk(self): for part in self.message.walk(): yield UMessage(part) def get_payload( self, index: Optional[Any] = None, decode: bool = False ) -> Union[str, "UMessage", List["UMessage"]]: message = self.message if index is None: # mypy: Argument 1 to "get_payload" of "Message" has incompatible type "None"; # mypy: expected "int" # email.message.Message.get_payload has type signature: # Message.get_payload(self, i=None, decode=False) # so None seems to be totally acceptable, I don't understand mypy here payload = message.get_payload(index, decode) # type: ignore if isinstance(payload, list): return [UMessage(msg) for msg in payload] if message.get_content_maintype() != "text": return payload if isinstance(payload, str): 
return payload charset = message.get_content_charset() or "iso-8859-1" if search_function(charset) is None: charset = "iso-8859-1" return str(payload or b"", charset, "replace") else: payload = UMessage(message.get_payload(index, decode)) return payload def get_content_maintype(self): return str(self.message.get_content_maintype()) def get_content_type(self): return str(self.message.get_content_type()) def get_filename(self, failobj=None): value = self.message.get_filename(failobj) if value is failobj: return value try: return str(value) except UnicodeDecodeError: return "error decoding filename" # other convenience methods ############################################### def headers(self): """return an unicode string containing all the message's headers""" values = [] for header in self.message.keys(): values.append(f"{header}: {self.get(header)}") return "\n".join(values) def multi_addrs(self, header): """return a list of 2-uple (name, address) for the given address (which is expected to be an header containing address such as from, to, cc...) """ persons = [] for person in self.get_all(header, ()): name, mail = parseaddr(person) persons.append((name, mail)) return persons def date(self, alternative_source=False, return_str=False): """return a datetime object for the email's date or None if no date is set or if it can't be parsed """ value = self.get("date") if value is None and alternative_source: unix_from = self.message.get_unixfrom() if unix_from is not None: try: value = unix_from.split(" ", 2)[2] except IndexError: pass if value is not None: datetuple = parsedate(value) if datetuple: if lgc.USE_MX_DATETIME: return DateTime(*datetuple[:6]) return datetime(*datetuple[:6]) elif not return_str: return None return value ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1741359094.8901825 logilab_common-2.1.0/logilab/common/ureports/0000755000000000000000000000000014762603767020037 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/ureports/__init__.py0000666000000000000000000001752014762603732022151 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it # under the terms of the GNU Lesser General Public License as published by the # Free Software Foundation, either version 2.1 of the License, # or (at your option) any later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License # along with logilab-common. If not, see . """Universal report objects and some formatting drivers. A way to create simple reports using python objects, primarily designed to be formatted as text and html. 
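A minimal illustrative sketch (hypothetical content, written to stdout)::

    from logilab.common.ureports import Section, Text, TextWriter

    layout = Section(title="Hello", children=(Text("some text"),))
    TextWriter().format(layout)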
""" __docformat__ = "restructuredtext en" import sys from typing import Any, Optional, Union, List as ListType, Generator, Tuple, Callable, TextIO from io import StringIO from logilab.common.textutils import linesep from logilab.common.tree import VNode from logilab.common.ureports.nodes import Table, Section, Link, Paragraph, Title, Text from logilab.common.ureports.nodes import VerbatimText, Image, Span, List # noqa def get_nodes(node, klass): """return an iterator on all children node of the given klass""" for child in node.children: if isinstance(child, klass): yield child # recurse (FIXME: recursion controled by an option) for grandchild in get_nodes(child, klass): yield grandchild def layout_title(layout): """try to return the layout's title as string, return None if not found""" for child in layout.children: if isinstance(child, Title): return " ".join([node.data for node in get_nodes(child, Text)]) def build_summary(layout, level=1): """make a summary for the report, including X level""" assert level > 0 level -= 1 summary = ListType(klass="summary") for child in layout.children: if not isinstance(child, Section): continue label = layout_title(child) if not label and not child.id: continue if not child.id: child.id = label.replace(" ", "-") node = Link("#" + child.id, label=label or child.id) # FIXME: Three following lines produce not very compliant # docbook: there are some useless . They might be # replaced by the three commented lines but this then produces # a bug in html display... if level and [n for n in child.children if isinstance(n, Section)]: node = Paragraph([node, build_summary(child, level)]) summary.append(node) # summary.append(node) # if level and [n for n in child.children if isinstance(n, Section)]: # summary.append(build_summary(child, level)) return summary class BaseWriter: """base class for ureport writers""" def format( self, layout: Any, stream: Optional[Union[StringIO, TextIO]] = None, encoding: Optional[Any] = None, ) -> None: """format and write the given layout into the stream object unicode policy: unicode strings may be found in the layout; try to call stream.write with it, but give it back encoded using the given encoding if it fails """ if stream is None: stream = sys.stdout if not encoding: encoding = getattr(stream, "encoding", "UTF-8") self.encoding = encoding or "UTF-8" self.__compute_funcs: ListType[Tuple[Callable[[str], Any], Callable[[str], Any]]] = [] self.out = stream self.begin_format(layout) layout.accept(self) self.end_format(layout) def format_children(self, layout: Union["Paragraph", "Section", "Title"]) -> None: """recurse on the layout children and call their accept method (see the Visitor pattern) """ for child in getattr(layout, "children", ()): child.accept(self) def writeln(self, string: str = "") -> None: """write a line in the output buffer""" self.write(string + linesep) def write(self, string: str) -> None: """write a string in the output buffer""" try: self.out.write(string) except UnicodeEncodeError: # mypy: Argument 1 to "write" of "IO" has incompatible type "bytes"; expected "str" # probably a python3 port issue? 
self.out.write(string.encode(self.encoding)) # type: ignore def begin_format(self, layout: Any) -> None: """begin to format a layout""" self.section = 0 def end_format(self, layout: Any) -> None: """finished to format a layout""" def get_table_content(self, table: Table) -> ListType[ListType[str]]: """trick to get table content without actually writing it return an aligned list of lists containing table cells values as string """ result: ListType[ListType[str]] = [[]] # mypy: "Table" has no attribute "cols" # dynamic attribute cols = table.cols # type: ignore for cell in self.compute_content(table): if cols == 0: result.append([]) # mypy: "Table" has no attribute "cols" # dynamic attribute cols = table.cols # type: ignore cols -= 1 result[-1].append(cell) # fill missing cells while len(result[-1]) < cols: result[-1].append("") return result def compute_content(self, layout: VNode) -> Generator[str, Any, None]: """trick to compute the formatting of children layout before actually writing it return an iterator on strings (one for each child element) """ # use cells ! def write(data: str) -> None: try: stream.write(data) except UnicodeEncodeError: # mypy: Argument 1 to "write" of "TextIOWrapper" has incompatible type "bytes"; # mypy: expected "str" # error from porting to python3? stream.write(data.encode(self.encoding)) # type: ignore def writeln(data: str = "") -> None: try: stream.write(data + linesep) except UnicodeEncodeError: # mypy: Unsupported operand types for + ("bytes" and "str") # error from porting to python3? stream.write(data.encode(self.encoding) + linesep) # type: ignore # mypy: Cannot assign to a method # this really looks like black dirty magic since self.write is reused elsewhere in the code # especially since self.write and self.writeln are conditionally # deleted at the end of this function self.write = write # type: ignore self.writeln = writeln # type: ignore self.__compute_funcs.append((write, writeln)) # mypy: Item "Table" of "Union[ListType[Any], Table, Title]" has no attribute "children" # dynamic attribute? for child in layout.children: # type: ignore stream = StringIO() child.accept(self) yield stream.getvalue() self.__compute_funcs.pop() try: # mypy: Cannot assign to a method # even more black dirty magic self.write, self.writeln = self.__compute_funcs[-1] # type: ignore except IndexError: del self.write del self.writeln from logilab.common.ureports.text_writer import TextWriter # noqa from logilab.common.ureports.html_writer import HTMLWriter # noqa ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/ureports/docbook_writer.py0000666000000000000000000001303014762603732023416 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it # under the terms of the GNU Lesser General Public License as published by the # Free Software Foundation, either version 2.1 of the License, # or (at your option) any later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License # along with logilab-common. If not, see . 
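# Illustrative sketch (hypothetical writer): concrete writers subclass
# BaseWriter and provide visit_<nodename> methods that VisitedMixIn.accept
# dispatches to, e.g.:
#
#   from logilab.common.ureports import BaseWriter
#
#   class PlainWriter(BaseWriter):
#       def visit_section(self, layout):
#           self.format_children(layout)
#           self.writeln()
#
#       def visit_text(self, layout):
#           self.write(layout.data)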
"""HTML formatting drivers for ureports""" __docformat__ = "restructuredtext en" from logilab.common.ureports.html_writer import HTMLWriter class DocbookWriter(HTMLWriter): """format layouts as HTML""" def begin_format(self, layout): """begin to format a layout""" super(HTMLWriter, self).begin_format(layout) if self.snippet is None: self.writeln('') self.writeln( """ """ ) def end_format(self, layout): """finished to format a layout""" if self.snippet is None: self.writeln("") def visit_section(self, layout): """display a section (using (level 0) or
)""" if self.section == 0: tag = "chapter" else: tag = "section" self.section += 1 self.writeln(self._indent(f"<{tag}{self.handle_attrs(layout)}>")) self.format_children(layout) self.writeln(self._indent(f"")) self.section -= 1 def visit_title(self, layout): """display a title using """ self.write(self._indent(f" <title{self.handle_attrs(layout)}>")) self.format_children(layout) self.writeln("") def visit_table(self, layout): """display a table as html""" self.writeln( self._indent(f" {layout.title}") ) self.writeln(self._indent(f' ')) for i in range(layout.cols): self.writeln(self._indent(f' ')) table_content = self.get_table_content(layout) # write headers if layout.cheaders: self.writeln(self._indent(" ")) self._write_row(table_content[0]) self.writeln(self._indent(" ")) table_content = table_content[1:] elif layout.rcheaders: self.writeln(self._indent(" ")) self._write_row(table_content[-1]) self.writeln(self._indent(" ")) table_content = table_content[:-1] # write body self.writeln(self._indent(" ")) for i in range(len(table_content)): row = table_content[i] self.writeln(self._indent(" ")) for j in range(len(row)): cell = row[j] or " " self.writeln(self._indent(f" {cell}")) self.writeln(self._indent(" ")) self.writeln(self._indent(" ")) self.writeln(self._indent(" ")) self.writeln(self._indent(" ")) def _write_row(self, row): """write content of row (using )""" self.writeln(" ") for j in range(len(row)): cell = row[j] or " " self.writeln(f" {cell}") self.writeln(self._indent(" ")) def visit_list(self, layout): """display a list (using )""" self.writeln(self._indent(f" ")) for row in list(self.compute_content(layout)): self.writeln(f" {row}") self.writeln(self._indent(" ")) def visit_paragraph(self, layout): """display links (using )""" self.write(self._indent(" ")) self.format_children(layout) self.writeln("") def visit_span(self, layout): """display links (using

)""" # TODO: translate in docbook self.write(f"") self.format_children(layout) self.write("") def visit_link(self, layout): """display links (using )""" self.write( '{}'.format( layout.url, self.handle_attrs(layout), layout.label ) ) def visit_verbatimtext(self, layout): """display verbatim text (using )""" self.writeln(self._indent(" ")) self.write(layout.data.replace("&", "&").replace("<", "<")) self.writeln(self._indent(" ")) def visit_text(self, layout): """add some text""" self.write(layout.data.replace("&", "&").replace("<", "<")) def _indent(self, string): """correctly indent string according to section""" return " " * 2 * (self.section) + string ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/ureports/html_writer.py0000666000000000000000000001201314762603732022742 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it # under the terms of the GNU Lesser General Public License as published by the # Free Software Foundation, either version 2.1 of the License, # or (at your option) any later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License # along with logilab-common. If not, see . """HTML formatting drivers for ureports""" __docformat__ = "restructuredtext en" from logilab.common.ureports import BaseWriter from logilab.common.ureports.nodes import ( Section, Title, Table, List, Paragraph, Link, VerbatimText, Text, ) from typing import Any class HTMLWriter(BaseWriter): """format layouts as HTML""" def __init__(self, snippet: int = None) -> None: super(HTMLWriter, self).__init__() self.snippet = snippet def handle_attrs(self, layout: Any) -> str: """get an attribute string from layout member attributes""" attrs = "" klass = getattr(layout, "klass", None) if klass: attrs += f' class="{klass}"' nid = getattr(layout, "id", None) if nid: attrs += f' id="{nid}"' return attrs def begin_format(self, layout: Any) -> None: """begin to format a layout""" super(HTMLWriter, self).begin_format(layout) if self.snippet is None: self.writeln("") self.writeln("") def end_format(self, layout: Any) -> None: """finished to format a layout""" if self.snippet is None: self.writeln("") self.writeln("") def visit_section(self, layout: Section) -> None: """display a section as html, using div + h[section level]""" self.section += 1 self.writeln(f"") self.format_children(layout) self.writeln("") self.section -= 1 def visit_title(self, layout: Title) -> None: """display a title using """ self.write(f"") self.format_children(layout) self.writeln(f"") def visit_table(self, layout: Table) -> None: """display a table as html""" self.writeln(f"") table_content = self.get_table_content(layout) for i in range(len(table_content)): row = table_content[i] if i == 0 and layout.rheaders: self.writeln('') elif i + 1 == len(table_content) and layout.rrheaders: self.writeln('') else: self.writeln('' % (i % 2 and "even" or "odd")) for j in range(len(row)): cell = row[j] or " " if ( (layout.rheaders and i == 0) or (layout.cheaders and j == 0) or 
(layout.rrheaders and i + 1 == len(table_content)) or (layout.rcheaders and j + 1 == len(row)) ): self.writeln(f"{cell}") else: self.writeln(f"{cell}") self.writeln("") self.writeln("") def visit_list(self, layout: List) -> None: """display a list as html""" self.writeln(f"") for row in list(self.compute_content(layout)): self.writeln(f"

<li>{row}</li>")
        self.writeln("</ul>")

    def visit_paragraph(self, layout: Paragraph) -> None:
        """display links (using <p>)"""
        self.write("<p>")
        self.format_children(layout)
        self.write("</p>")

    def visit_span(self, layout):
        """display links (using <span>)"""
        self.write(f"<span{self.handle_attrs(layout)}>")
        self.format_children(layout)
        self.write("</span>")

    def visit_link(self, layout: Link) -> None:
        """display links (using <a>)"""
        self.write(f' <a href="{layout.url}"{self.handle_attrs(layout)}>{layout.label}</a>')

    def visit_verbatimtext(self, layout: VerbatimText) -> None:
        """display verbatim text (using <pre>)"""
        self.write("<pre>")
        self.write(layout.data.replace("&", "&amp;").replace("<", "&lt;"))
        self.write("</pre>
    ") def visit_text(self, layout: Text) -> None: """add some text""" data = layout.data if layout.escaped: data = data.replace("&", "&").replace("<", "<") self.write(data) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/ureports/nodes.py0000666000000000000000000001520014762603732021513 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it # under the terms of the GNU Lesser General Public License as published by the # Free Software Foundation, either version 2.1 of the License, # or (at your option) any later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License # along with logilab-common. If not, see . """Micro reports objects. A micro report is a tree of layout and content objects. """ __docformat__ = "restructuredtext en" from logilab.common.tree import VNode from typing import Optional # from logilab.common.ureports.nodes import List # from logilab.common.ureports.nodes import Paragraph # from logilab.common.ureports.nodes import Text from typing import Any from typing import List as TypingList from typing import Tuple from typing import Union class BaseComponent(VNode): """base report component attributes * id : the component's optional id * klass : the component's optional klass """ def __init__(self, id: Optional[str] = None, klass: Optional[str] = None) -> None: VNode.__init__(self, id) self.klass = klass class BaseLayout(BaseComponent): """base container node attributes * BaseComponent attributes * children : components in this table (i.e. 
the table's cells) """ def __init__( self, children: Union[ TypingList["Text"], Tuple[Union["Paragraph", str], Union[TypingList, str]], Tuple[str, ...], ] = (), **kwargs: Any, ) -> None: super(BaseLayout, self).__init__(**kwargs) for child in children: if isinstance(child, BaseComponent): self.append(child) else: # mypy: Argument 1 to "add_text" of "BaseLayout" has incompatible type # mypy: "Union[str, List[Any]]"; expected "str" # we check this situation in the if self.add_text(child) # type: ignore def append(self, child: Any) -> None: """overridden to detect problems easily""" assert child not in self.parents() VNode.append(self, child) def parents(self) -> TypingList: """return the ancestor nodes""" assert self.parent is not self if self.parent is None: return [] return [self.parent] + self.parent.parents() def add_text(self, text: str) -> None: """shortcut to add text data""" self.children.append(Text(text)) # non container nodes ######################################################### class Text(BaseComponent): """a text portion attributes : * BaseComponent attributes * data : the text value as an encoded or unicode string """ def __init__(self, data: str, escaped: bool = True, **kwargs: Any) -> None: super(Text, self).__init__(**kwargs) # if isinstance(data, unicode): # data = data.encode('ascii') assert isinstance(data, str), data.__class__ self.escaped = escaped self.data = data class VerbatimText(Text): """a verbatim text, display the raw data attributes : * BaseComponent attributes * data : the text value as an encoded or unicode string """ class Link(BaseComponent): """a labelled link attributes : * BaseComponent attributes * url : the link's target (REQUIRED) * label : the link's label as a string (use the url by default) """ def __init__(self, url: str, label: str = None, **kwargs: Any) -> None: super(Link, self).__init__(**kwargs) assert url self.url = url self.label = label or url class Image(BaseComponent): """an embedded or a single image attributes : * BaseComponent attributes * filename : the image's filename (REQUIRED) * stream : the stream object containing the image data (REQUIRED) * title : the image's optional title """ def __init__(self, filename, stream, title=None, **kwargs): super(Image, self).__init__(**kwargs) assert filename assert stream self.filename = filename self.stream = stream self.title = title # container nodes ############################################################# class Section(BaseLayout): """a section attributes : * BaseLayout attributes a title may also be given to the constructor, it'll be added as a first element a description may also be given to the constructor, it'll be added as a first paragraph """ def __init__(self, title: str = None, description: str = None, **kwargs: Any) -> None: super(Section, self).__init__(**kwargs) if description: self.insert(0, Paragraph([Text(description)])) if title: self.insert(0, Title(children=(title,))) class Title(BaseLayout): """a title attributes : * BaseLayout attributes A title must not contains a section nor a paragraph! """ class Span(BaseLayout): """a title attributes : * BaseLayout attributes A span should only contains Text and Link nodes (in-line elements) """ class Paragraph(BaseLayout): """a simple text paragraph attributes : * BaseLayout attributes A paragraph must not contains a section ! 
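    For instance (illustrative)::

        Paragraph([Text("see "), Link("http://www.logilab.org", label="the site")])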
""" class Table(BaseLayout): """some tabular data attributes : * BaseLayout attributes * cols : the number of columns of the table (REQUIRED) * rheaders : the first row's elements are table's header * cheaders : the first col's elements are table's header * title : the table's optional title """ def __init__( self, cols: int, title: Optional[Any] = None, rheaders: int = 0, cheaders: int = 0, rrheaders: int = 0, rcheaders: int = 0, **kwargs: Any, ) -> None: super(Table, self).__init__(**kwargs) assert isinstance(cols, int) self.cols = cols self.title = title self.rheaders = rheaders self.cheaders = cheaders self.rrheaders = rrheaders self.rcheaders = rcheaders class List(BaseLayout): """some list data attributes : * BaseLayout attributes """ ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/ureports/text_writer.py0000666000000000000000000001271314762603732022771 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it # under the terms of the GNU Lesser General Public License as published by the # Free Software Foundation, either version 2.1 of the License, # or (at your option) any later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License # along with logilab-common. If not, see . """Text formatting drivers for ureports""" from typing import Any, List, Tuple __docformat__ = "restructuredtext en" from logilab.common.textutils import linesep from logilab.common.ureports import BaseWriter from logilab.common.ureports.nodes import ( Section, Title, Table, List as NodeList, Paragraph, Link, VerbatimText, Text, ) TITLE_UNDERLINES = ["", "=", "-", "`", ".", "~", "^"] BULLETS = ["*", "-"] class TextWriter(BaseWriter): """format layouts as text (ReStructured inspiration but not totally handled yet) """ def begin_format(self, layout: Any) -> None: super(TextWriter, self).begin_format(layout) self.list_level = 0 self.pending_urls: List[Tuple[str, str]] = [] def visit_section(self, layout: Section) -> None: """display a section as text""" self.section += 1 self.writeln() self.format_children(layout) if self.pending_urls: self.writeln() for label, url in self.pending_urls: self.writeln(f".. _`{label}`: {url}") self.pending_urls = [] self.section -= 1 self.writeln() def visit_title(self, layout: Title) -> None: title = "".join(list(self.compute_content(layout))) self.writeln(title) try: self.writeln(TITLE_UNDERLINES[self.section] * len(title)) except IndexError: print("FIXME TITLE TOO DEEP. 
TURNING TITLE INTO TEXT") def visit_paragraph(self, layout: "Paragraph") -> None: """enter a paragraph""" self.format_children(layout) self.writeln() def visit_span(self, layout): """enter a span""" self.format_children(layout) def visit_table(self, layout: Table) -> None: """display a table as text""" table_content = self.get_table_content(layout) # get columns width cols_width = [0] * len(table_content[0]) for row in table_content: for index in range(len(row)): col = row[index] cols_width[index] = max(cols_width[index], len(col)) if layout.klass == "field": self.field_table(layout, table_content, cols_width) else: self.default_table(layout, table_content, cols_width) self.writeln() def default_table( self, layout: Table, table_content: List[List[str]], cols_width: List[int] ) -> None: """format a table""" cols_width = [size + 1 for size in cols_width] format_strings = " ".join(["%%-%ss"] * len(cols_width)) format_strings = format_strings % tuple(cols_width) format_strings_list = format_strings.split(" ") table_linesep = "\n+" + "+".join(["-" * w for w in cols_width]) + "+\n" headsep = "\n+" + "+".join(["=" * w for w in cols_width]) + "+\n" # FIXME: layout.cheaders self.write(table_linesep) for i in range(len(table_content)): self.write("|") line = table_content[i] for j in range(len(line)): self.write(format_strings_list[j] % line[j]) self.write("|") if i == 0 and layout.rheaders: self.write(headsep) else: self.write(table_linesep) def field_table( self, layout: Table, table_content: List[List[str]], cols_width: List[int] ) -> None: """special case for field table""" assert layout.cols == 2 format_string = "%s%%-%ss: %%s" % (linesep, cols_width[0]) for field, value in table_content: self.write(format_string % (field, value)) def visit_list(self, layout: NodeList) -> None: """display a list layout as text""" bullet = BULLETS[self.list_level % len(BULLETS)] indent = " " * self.list_level self.list_level += 1 for child in layout.children: self.write(f"{linesep}{indent}{bullet} ") child.accept(self) self.list_level -= 1 def visit_link(self, layout: Link) -> None: """add a hyperlink""" if layout.label != layout.url: self.write(f"`{layout.label}`_") self.pending_urls.append((layout.label, layout.url)) else: self.write(layout.url) def visit_verbatimtext(self, layout: VerbatimText) -> None: """display a verbatim layout as text (so difficult ;)""" self.writeln("::\n") for line in layout.data.splitlines(): self.writeln(" " + line) self.writeln() def visit_text(self, layout: Text) -> None: """add some text""" self.write(f"{layout.data}") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/urllib2ext.py0000666000000000000000000000642014762603732020620 0ustar00rootrootimport logging import urllib2 import kerberos as krb import re RGX = re.compile(r"(?:.*,)*\s*Negotiate\s*([^,]*),?", re.I) class GssapiAuthError(Exception): """raised on error during authentication process""" def get_negociate_value(headers): for authreq in headers.getheaders("www-authenticate"): match = RGX.search(authreq) if match: return match.group(1) class HTTPGssapiAuthHandler(urllib2.BaseHandler): """Negotiate HTTP authentication using context from GSSAPI""" handler_order = 400 # before Digest Auth def __init__(self): self._reset() def _reset(self): self._retried = 0 self._context = None def clean_context(self): if self._context is not None: krb.authGSSClientClean(self._context) def http_error_401(self, req, fp, code, msg, headers): try: if 
self._retried > 5: raise urllib2.HTTPError( req.get_full_url(), 401, "negotiate auth failed", headers, None ) self._retried += 1 logging.debug(f"gssapi handler, try {self._retried}") negotiate = get_negociate_value(headers) if negotiate is None: logging.debug("no negociate found in a www-authenticate header") return None logging.debug(f"HTTPGssapiAuthHandler: negotiate 1 is {negotiate!r}") result, self._context = krb.authGSSClientInit(f"HTTP@{req.get_host()}") if result < 1: raise GssapiAuthError("HTTPGssapiAuthHandler: init failed with %d" % result) result = krb.authGSSClientStep(self._context, negotiate) if result < 0: raise GssapiAuthError("HTTPGssapiAuthHandler: step 1 failed with %d" % result) client_response = krb.authGSSClientResponse(self._context) logging.debug(f"HTTPGssapiAuthHandler: client response is {client_response[:10]}...") req.add_unredirected_header("Authorization", f"Negotiate {client_response}") server_response = self.parent.open(req) negotiate = get_negociate_value(server_response.info()) if negotiate is None: logging.warning("HTTPGssapiAuthHandler: failed to authenticate server") else: logging.debug(f"HTTPGssapiAuthHandler negotiate 2: {negotiate}") result = krb.authGSSClientStep(self._context, negotiate) if result < 1: raise GssapiAuthError("HTTPGssapiAuthHandler: step 2 failed with %d" % result) return server_response except GssapiAuthError as exc: logging.error(repr(exc)) finally: self.clean_context() self._reset() if __name__ == "__main__": import sys # debug import httplib httplib.HTTPConnection.debuglevel = 1 httplib.HTTPSConnection.debuglevel = 1 logging.basicConfig(level=logging.DEBUG) # handle cookies import cookielib cj = cookielib.CookieJar() ch = urllib2.HTTPCookieProcessor(cj) # test with url sys.argv[1] h = HTTPGssapiAuthHandler() response = urllib2.build_opener(h, ch).open(sys.argv[1]) print(f"\nresponse: {response.code}\n--------------\n", response.info()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/vcgutils.py0000666000000000000000000001576614762603732020401 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Functions to generate files readable with Georg Sander's vcg (Visualization of Compiler Graphs). You can download vcg at http://rw4.cs.uni-sb.de/~sander/html/gshome.html Note that vcg exists as a debian package. See vcg's documentation for explanation about the different values that maybe used for the functions parameters. 
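A minimal illustrative sketch (hypothetical graph content)::

    import sys

    printer = VCGPrinter(sys.stdout)
    printer.open_graph(title="example")
    printer.node("A", shape="box")
    printer.node("B")
    printer.edge("A", "B", arrowstyle="solid")
    printer.close_graph()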
""" __docformat__ = "restructuredtext en" import string ATTRS_VAL = { "algos": ( "dfs", "tree", "minbackward", "left_to_right", "right_to_left", "top_to_bottom", "bottom_to_top", "maxdepth", "maxdepthslow", "mindepth", "mindepthslow", "mindegree", "minindegree", "minoutdegree", "maxdegree", "maxindegree", "maxoutdegree", ), "booleans": ("yes", "no"), "colors": ( "black", "white", "blue", "red", "green", "yellow", "magenta", "lightgrey", "cyan", "darkgrey", "darkblue", "darkred", "darkgreen", "darkyellow", "darkmagenta", "darkcyan", "gold", "lightblue", "lightred", "lightgreen", "lightyellow", "lightmagenta", "lightcyan", "lilac", "turquoise", "aquamarine", "khaki", "purple", "yellowgreen", "pink", "orange", "orchid", ), "shapes": ("box", "ellipse", "rhomb", "triangle"), "textmodes": ("center", "left_justify", "right_justify"), "arrowstyles": ("solid", "line", "none"), "linestyles": ("continuous", "dashed", "dotted", "invisible"), } # meaning of possible values: # O -> string # 1 -> int # list -> value in list GRAPH_ATTRS = { "title": 0, "label": 0, "color": ATTRS_VAL["colors"], "textcolor": ATTRS_VAL["colors"], "bordercolor": ATTRS_VAL["colors"], "width": 1, "height": 1, "borderwidth": 1, "textmode": ATTRS_VAL["textmodes"], "shape": ATTRS_VAL["shapes"], "shrink": 1, "stretch": 1, "orientation": ATTRS_VAL["algos"], "vertical_order": 1, "horizontal_order": 1, "xspace": 1, "yspace": 1, "layoutalgorithm": ATTRS_VAL["algos"], "late_edge_labels": ATTRS_VAL["booleans"], "display_edge_labels": ATTRS_VAL["booleans"], "dirty_edge_labels": ATTRS_VAL["booleans"], "finetuning": ATTRS_VAL["booleans"], "manhattan_edges": ATTRS_VAL["booleans"], "smanhattan_edges": ATTRS_VAL["booleans"], "port_sharing": ATTRS_VAL["booleans"], "edges": ATTRS_VAL["booleans"], "nodes": ATTRS_VAL["booleans"], "splines": ATTRS_VAL["booleans"], } NODE_ATTRS = { "title": 0, "label": 0, "color": ATTRS_VAL["colors"], "textcolor": ATTRS_VAL["colors"], "bordercolor": ATTRS_VAL["colors"], "width": 1, "height": 1, "borderwidth": 1, "textmode": ATTRS_VAL["textmodes"], "shape": ATTRS_VAL["shapes"], "shrink": 1, "stretch": 1, "vertical_order": 1, "horizontal_order": 1, } EDGE_ATTRS = { "sourcename": 0, "targetname": 0, "label": 0, "linestyle": ATTRS_VAL["linestyles"], "class": 1, "thickness": 0, "color": ATTRS_VAL["colors"], "textcolor": ATTRS_VAL["colors"], "arrowcolor": ATTRS_VAL["colors"], "backarrowcolor": ATTRS_VAL["colors"], "arrowsize": 1, "backarrowsize": 1, "arrowstyle": ATTRS_VAL["arrowstyles"], "backarrowstyle": ATTRS_VAL["arrowstyles"], "textmode": ATTRS_VAL["textmodes"], "priority": 1, "anchor": 1, "horizontal_order": 1, } # Misc utilities ############################################################### def latin_to_vcg(st): """Convert latin characters using vcg escape sequence.""" for char in st: if char not in string.ascii_letters: try: num = ord(char) if num >= 192: st = st.replace(char, r"\fi%d" % ord(char)) except Exception: pass return st class VCGPrinter: """A vcg graph writer.""" def __init__(self, output_stream): self._stream = output_stream self._indent = "" def open_graph(self, **args): """open a vcg graph""" self._stream.write("%sgraph:{\n" % self._indent) self._inc_indent() self._write_attributes(GRAPH_ATTRS, **args) def close_graph(self): """close a vcg graph""" self._dec_indent() self._stream.write("%s}\n" % self._indent) def node(self, title, **args): """draw a node""" self._stream.write(f'{self._indent}node: {{title:"{title}"') self._write_attributes(NODE_ATTRS, **args) self._stream.write("}\n") def edge(self, 
from_node, to_node, edge_type="", **args): """draw an edge from a node to another.""" self._stream.write( '%s%sedge: {sourcename:"%s" targetname:"%s"' % (self._indent, edge_type, from_node, to_node) ) self._write_attributes(EDGE_ATTRS, **args) self._stream.write("}\n") # private ################################################################## def _write_attributes(self, attributes_dict, **args): """write graph, node or edge attributes""" for key, value in args.items(): try: _type = attributes_dict[key] except KeyError: raise Exception( f"""no such attribute {key} possible attributes are {attributes_dict.keys()}""" ) if not _type: self._stream.write(f'{self._indent}{key}:"{value}"\n') elif _type == 1: self._stream.write(f"{self._indent}{key}:{int(value)}\n") elif value in _type: self._stream.write(f"{self._indent}{key}:{value}\n") else: raise Exception( f"""value {value} isn't correct for attribute {key} correct values are {_type}""" ) def _inc_indent(self): """increment indentation""" self._indent = f" {self._indent}" def _dec_indent(self): """decrement indentation""" self._indent = self._indent[:-2] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/visitor.py0000666000000000000000000000731714762603732020231 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . 
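The VCGPrinter class defined in vcgutils.py above is only exercised indirectly elsewhere in the package, so a minimal usage sketch may help. It is illustrative and not part of the distributed archive: it assumes the package is installed so that logilab.common.vcgutils is importable, and it only uses attribute names that appear in the GRAPH_ATTRS, NODE_ATTRS and EDGE_ATTRS dictionaries above.

import sys

from logilab.common.vcgutils import VCGPrinter

# write a two-node VCG graph to stdout; any file-like object works as the stream
printer = VCGPrinter(sys.stdout)
printer.open_graph(title="example", layoutalgorithm="dfs")  # "dfs" is listed in ATTRS_VAL["algos"]
printer.node("hello", shape="box")           # node attributes are checked against NODE_ATTRS
printer.node("world", shape="ellipse")
printer.edge("hello", "world", arrowstyle="solid", label="greets")
printer.close_graph()                        # closes the graph:{ ... } block

Because _write_attributes() validates every keyword against the attribute dictionaries, passing an unknown name or an out-of-range value raises an exception rather than producing an invalid VCG file.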
"""A generic visitor abstract implementation.""" from typing import Any, Callable, Optional, Union from logilab.common.types import Node, HTMLWriter, TextWriter __docformat__ = "restructuredtext en" def no_filter(_: Node) -> int: return 1 # Iterators ################################################################### class FilteredIterator: def __init__(self, node: Node, list_func: Callable, filter_func: Optional[Any] = None) -> None: self._next = [(node, 0)] if filter_func is None: filter_func = no_filter self._list = list_func(node, filter_func) def __next__(self) -> Optional[Node]: try: return self._list.pop(0) except Exception: return None next = __next__ # Base Visitor ################################################################ class Visitor: def __init__(self, iterator_class, filter_func=None): self._iter_class = iterator_class self.filter = filter_func def visit(self, node, *args, **kargs): """ launch the visit on a given node call 'open_visit' before the beginning of the visit, with extra args given when all nodes have been visited, call the 'close_visit' method """ self.open_visit(node, *args, **kargs) return self.close_visit(self._visit(node)) def _visit(self, node): iterator = self._get_iterator(node) n = next(iterator) while n: result = n.accept(self) n = next(iterator) return result def _get_iterator(self, node): return self._iter_class(node, self.filter) def open_visit(self, *args, **kargs): """ method called at the beginning of the visit """ def close_visit(self, result): """ method called at the end of the visit """ return result # standard visited mixin ###################################################### class VisitedMixIn: """ Visited interface allow node visitors to use the node """ def get_visit_name(self) -> str: """ return the visit name for the mixed class. When calling 'accept', the method <'visit_' + name returned by this method> will be called on the visitor """ try: # mypy: "VisitedMixIn" has no attribute "TYPE" # dynamic attribute return self.TYPE.replace("-", "_") # type: ignore except Exception: return self.__class__.__name__.lower() def accept( self, visitor: Union[HTMLWriter, TextWriter], *args: Any, **kwargs: Any ) -> Optional[Any]: func = getattr(visitor, f"visit_{self.get_visit_name()}") return func(self, *args, **kwargs) def leave(self, visitor, *args, **kwargs): func = getattr(visitor, f"leave_{self.get_visit_name()}") return func(self, *args, **kwargs) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/logilab/common/xmlutils.py0000666000000000000000000000452614762603732020412 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """XML utilities. This module contains useful functions for parsing and using XML data. 
For the moment, there is only one function that can parse the data inside a processing instruction and return a Python dictionary. """ __docformat__ = "restructuredtext en" import re from typing import Dict, Optional RE_DOUBLE_QUOTE = re.compile(r'([\w\-\.]+)="([^"]+)"') RE_SIMPLE_QUOTE = re.compile(r"([\w\-\.]+)='([^']+)'") def parse_pi_data(pi_data: str) -> Dict[str, Optional[str]]: """ Utility function that parses the data contained in an XML processing instruction and returns a dictionary of keywords and their associated values (most of the time, the processing instructions contain data like ``keyword="value"``, if a keyword is not associated to a value, for example ``keyword``, it will be associated to ``None``). :param pi_data: data contained in an XML processing instruction. :type pi_data: unicode :returns: Dictionary of the keywords (Unicode strings) associated to their values (Unicode strings) as they were defined in the data. :rtype: dict """ results = {} for elt in pi_data.split(): val: Optional[str] double_match = RE_DOUBLE_QUOTE.match(elt) simple_match = RE_SIMPLE_QUOTE.match(elt) if double_match: kwd, val = double_match.groups() elif simple_match: kwd, val = simple_match.groups() else: kwd, val = elt, None results[kwd] = val return results ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.926183 logilab_common-2.1.0/logilab_common.egg-info/0000755000000000000000000000000014762603767017726 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359094.0 logilab_common-2.1.0/logilab_common.egg-info/PKG-INFO0000644000000000000000000000607714762603766021034 0ustar00rootrootMetadata-Version: 2.2 Name: logilab-common Version: 2.1.0 Summary: collection of low-level Python packages and modules used by Logilab projects Home-page: https://forge.extranet.logilab.fr/open-source/logilab-common Author: Logilab Author-email: contact@logilab.fr License: LGPL Classifier: Topic :: Utilities Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3 :: Only Requires-Python: >=3.6 License-File: COPYING License-File: COPYING.LESSER Requires-Dist: setuptools Requires-Dist: mypy-extensions Requires-Dist: typing_extensions Requires-Dist: importlib_metadata<7,>=6; python_version < "3.10" Dynamic: author Dynamic: author-email Dynamic: classifier Dynamic: description Dynamic: home-page Dynamic: license Dynamic: requires-dist Dynamic: requires-python Dynamic: summary Logilab's common library ======================== What's this ? ------------- This package contains some modules used by different Logilab projects. It is released under the GNU Lesser General Public License. There is no documentation available yet but the source code should be clean and well documented. Designed to ease: * handling command line options and configuration files * writing interactive command line tools * manipulation of files and character strings * manipulation of common structures such as graph, tree, and pattern such as visitor * generating text and HTML reports * more... 
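To make the behaviour of the parse_pi_data() function shown above in xmlutils.py concrete, here is a small illustrative call (not part of the archive; the stylesheet-like payload is an arbitrary example):

from logilab.common.xmlutils import parse_pi_data

# quoted values are captured, a bare keyword maps to None
print(parse_pi_data('href="style.css" type="text/css" alternate'))
# -> {'href': 'style.css', 'type': 'text/css', 'alternate': None}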
Documentation ------------- Documentation is available at https://logilab-common.readthedocs.io/ Installation ------------ logilab-common is available on pypi so you can install it using pip :: pip install logilab-common Or alternatively extract the tarball, jump into the created directory and run :: python setup.py install For installation options, see :: python setup.py install --help Building the documentation -------------------------- Create a virtualenv and install dependencies :: virtualenv venv source venv/bin/activate # you need the krb5-config command to build all dependencies # on debian you can get it using "apt-get install libkrb5-dev" pip install doc/requirements-doc.txt # install logilab-common pip install -e . Then build the doc :: cd doc make html It's now available under `doc/_build/html/` Code style ---------- The python code is verified against *flake8* and formatted with *black*. * You can run `tox -e black` to check that the files are well formatted. * You can run `tox -e black-run` to format them if needed. * You can include the `.hgrc` to your own `.hgrc` to automatically run black before each commit/amend. This can be done by writing `%include ../.hgrc` at the end of your `.hgrc`. Comments, support, bug reports ------------------------------ Project page https://www.logilab.org/project/logilab-common Use the cubicweb-devel at lists.cubicweb.org mailing list. You can subscribe to this mailing list at https://lists.cubicweb.org/mailman/listinfo/cubicweb-devel Archives are available at https://lists.cubicweb.org/pipermail/cubicweb-devel/ ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359094.0 logilab_common-2.1.0/logilab_common.egg-info/SOURCES.txt0000644000000000000000000000764414762603766021624 0ustar00rootrootCOPYING COPYING.LESSER ChangeLog MANIFEST.in README.rst __pkginfo__.py announce.txt pyproject.toml requirements-test.txt setup.cfg setup.py tox.ini bin/logilab-pytest bin/logilab-pytest.bat docs/Makefile docs/changelog.rst docs/conf.py docs/index.rst docs/logilab-pytest.1 docs/logilab.common.rst docs/logilab.common.ureports.rst docs/logilab.rst docs/make.bat docs/modules.rst docs/requirements-doc.txt logilab/common/__init__.py logilab/common/cache.py logilab/common/changelog.py logilab/common/clcommands.py logilab/common/compat.py logilab/common/configuration.py logilab/common/daemon.py logilab/common/date.py logilab/common/debugger.py logilab/common/decorators.py logilab/common/deprecation.py logilab/common/fileutils.py logilab/common/graph.py logilab/common/interface.py logilab/common/logging_ext.py logilab/common/modutils.py logilab/common/optik_ext.py logilab/common/proc.py logilab/common/py.typed logilab/common/registry.py logilab/common/shellutils.py logilab/common/sphinx_ext.py logilab/common/sphinxutils.py logilab/common/table.py logilab/common/tasksqueue.py logilab/common/testlib.py logilab/common/textutils.py logilab/common/tree.py logilab/common/types.py logilab/common/umessage.py logilab/common/urllib2ext.py logilab/common/vcgutils.py logilab/common/visitor.py logilab/common/xmlutils.py logilab/common/ureports/__init__.py logilab/common/ureports/docbook_writer.py logilab/common/ureports/html_writer.py logilab/common/ureports/nodes.py logilab/common/ureports/text_writer.py logilab_common.egg-info/PKG-INFO logilab_common.egg-info/SOURCES.txt logilab_common.egg-info/dependency_links.txt logilab_common.egg-info/namespace_packages.txt logilab_common.egg-info/requires.txt logilab_common.egg-info/top_level.txt 
test/test_cache.py test/test_changelog.py test/test_configuration.py test/test_date.py test/test_decorators.py test/test_deprecation.py test/test_fileutils.py test/test_graph.py test/test_interface.py test/test_modutils.py test/test_registry.py test/test_shellutils.py test/test_table.py test/test_taskqueue.py test/test_testlib.py test/test_textutils.py test/test_tree.py test/test_umessage.py test/test_ureports_html.py test/test_ureports_text.py test/test_xmlutils.py test/utils.py test/data/ChangeLog test/data/MyPyPa-0.1.0-py2.5.egg test/data/MyPyPa-0.1.0.zip test/data/__init__.py test/data/__pkginfo__.py test/data/deprecation.py test/data/foo.txt test/data/module.py test/data/module2.py test/data/newlines.txt test/data/noendingnewline.py test/data/nonregr.py test/data/normal_file.txt test/data/regobjects.py test/data/regobjects2.py test/data/spam.txt test/data/test.ini test/data/test1.msg test/data/test2.msg test/data/write_protected_file.txt test/data/content_differ_dir/NOTHING test/data/content_differ_dir/README test/data/content_differ_dir/subdir/coin test/data/content_differ_dir/subdir/toto.txt test/data/file_differ_dir/NOTHING test/data/file_differ_dir/README test/data/file_differ_dir/subdir/toto.txt test/data/file_differ_dir/subdirtwo/Hello test/data/find_test/__init__.py test/data/find_test/foo.txt test/data/find_test/module.py test/data/find_test/module2.py test/data/find_test/newlines.txt test/data/find_test/noendingnewline.py test/data/find_test/nonregr.py test/data/find_test/normal_file.txt test/data/find_test/spam.txt test/data/find_test/test.ini test/data/find_test/test1.msg test/data/find_test/test2.msg test/data/find_test/write_protected_file.txt test/data/find_test/sub/doc.txt test/data/find_test/sub/momo.py test/data/lmfp/__init__.py test/data/lmfp/foo.py test/data/reference_dir/NOTHING test/data/reference_dir/README test/data/reference_dir/subdir/coin test/data/reference_dir/subdir/toto.txt test/data/same_dir/NOTHING test/data/same_dir/README test/data/same_dir/subdir/coin test/data/same_dir/subdir/toto.txt test/data/sub/doc.txt test/data/sub/momo.py test/data/subdir_differ_dir/NOTHING test/data/subdir_differ_dir/README test/data/subdir_differ_dir/subdir/coin test/data/subdir_differ_dir/subdir/toto.txt././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359094.0 logilab_common-2.1.0/logilab_common.egg-info/dependency_links.txt0000644000000000000000000000000114762603766023773 0ustar00rootroot ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359094.0 logilab_common-2.1.0/logilab_common.egg-info/namespace_packages.txt0000644000000000000000000000001014762603766024247 0ustar00rootrootlogilab ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359094.0 logilab_common-2.1.0/logilab_common.egg-info/requires.txt0000644000000000000000000000014214762603766022322 0ustar00rootrootsetuptools mypy-extensions typing_extensions [:python_version < "3.10"] importlib_metadata<7,>=6 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359094.0 logilab_common-2.1.0/logilab_common.egg-info/top_level.txt0000644000000000000000000000001014762603766022446 0ustar00rootrootlogilab ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/pyproject.toml0000666000000000000000000000037614762603732016171 0ustar00rootroot[tool.black] line-length = 100 target-version = ['py37'] exclude = '''( \( dist | docs | \.tox | 
\.hg | \.mypy_cache | \.pytest_cache | __pycache__ | logilab_common.egg-info \) )''' ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/requirements-test.txt0000666000000000000000000000002414762603732017504 0ustar00rootrootpytz egenix-mx-base ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1741359094.9301832 logilab_common-2.1.0/setup.cfg0000666000000000000000000000020714762603767015077 0ustar00rootroot[bdist_rpm] packager = Sylvain Thenault provides = logilab.common [egg_info] tag_build = tag_date = 0 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/setup.py0000666000000000000000000000410314762603732014757 0ustar00rootroot#!/usr/bin/env python # pylint: disable=W0404,W0622,W0704,W0613,W0152 # copyright 2003-2010 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it # under the terms of the GNU Lesser General Public License as published by the # Free Software Foundation, either version 2.1 of the License, or (at your # option) any later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """Generic Setup script, takes package info from __pkginfo__.py file.""" __docformat__ = "restructuredtext en" from os import path from setuptools import setup, find_namespace_packages here = path.abspath(path.dirname(__file__)) pkginfo = {} with open(path.join(here, "__pkginfo__.py")) as f: exec(f.read(), pkginfo) # Get the long description from the relevant file with open(path.join(here, "README.rst"), encoding="utf-8") as f: long_description = f.read() setup( name=pkginfo["distname"], version=pkginfo["version"], description=pkginfo["description"], long_description=long_description, url=pkginfo["web"], author=pkginfo["author"], author_email=pkginfo["author_email"], license=pkginfo["license"], # See https://pypi.python.org/pypi?%3Aaction=list_classifiers classifiers=pkginfo["classifiers"], packages=find_namespace_packages(include=["logilab"]), package_data={"logilab.common": ["py.typed"]}, include_package_data=True, namespace_packages=[pkginfo["subpackage_of"]], python_requires=">=3.6", install_requires=pkginfo["install_requires"], tests_require=pkginfo["tests_require"], scripts=pkginfo["scripts"], ) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1741359094.9021826 logilab_common-2.1.0/test/0000755000000000000000000000000014762603767014232 5ustar00rootroot././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1741359094.9101827 logilab_common-2.1.0/test/data/0000755000000000000000000000000014762603767015143 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/ChangeLog0000666000000000000000000000737214762603732016722 0ustar00rootrootChangeLog for logilab.devtools ============================== -- * added the missing dos2unix script to the distribution * major debianize refactoring using class / 
inheritance instead of functions composition * import the version control library from oobrother extended with code from devtools / apycot * Singing in the rain: - I'm - singing in the rain * Big change multiline tata titi toto - small change - other change - multiline change really ? - Eat your vegetable and brush after every meals 2004-02-13 -- 0.4.5 * fix debianize to handle dependencies to python standalone package (ie no "python" prefix in the default package) * fixed cvslog in rlog mode 2004-02-11 -- 0.4.4 * check web and ftp variables from __pkginfo__ * check for long and short descriptions in __pkginfo__ * outdated copyright is now a warning * consider distuils automaticaly install .c files * fix check_package exit status * merged sgml, elisp and data packages in generated debian files 2003-12-05 -- 0.4.3 * fix bug in buildeb making it usable from buildpackage... 2003-11-24 -- 0.4.2 * fixed pb with check_info_module and catalog, when not launched from the package directory * ignore build directory in check_manifest * fix to avoid pb with "non executed" docstring in pycoverage * add support for --help and fix exit status to pycoverage 2003-11-20 -- 0.4.1 * added code coverage tool, starting from http://www.garethrees.org/2001/12/04/python-coverage/ * added --help option to buildeb 2003-11-14 -- 0.4.0 * added a python script buildeb to build debian package (buildpackage call this script now) * debianize now puts tests in a separated package (-test) and generate package for zope >= 2.6.2 (i.e. python 2.2) * fix detection of examples directory in pkginfo * fix debhelper dependency in build-depends * remove minor bug in buildpackage (try to move archive.gz instead of archive.tar.gz * bug fix in debianize zope handler 2003-10-06 -- 0.3.4 * remove important bug in buildpackage (rm sourcetree when building a source distrib) * add version to dependency between main packages and sub-packages (-data, -elisp and -sgml) * change way of creating the .orig.tar.gz * create source distribution when building debian package * fix path in log message for MANIFEST.in, __pkginfo__ and bin directory * make changelog more robust * debianize bug fixes 2003-09-22 -- 0.3.3 * fix python.postinst script to avoid compiling of others packages :) 2003-09-19 -- 0.3.2 * add basic support for XSLT distribution * fix DTD and catalog handling in debianize * fix bug in check_pkginfo * updated documentation 2003-09-18 -- 0.3.1 * add support for data files in debianize * test python version in debianize * minor fixes * updated setup.py template 2003-09-18 -- 0.3.0 * updates for a new packaging standard * removed jabbercli, cvs_filecheck * added preparedistrib, tagpackage, pkginfo * simpler debianize relying on a generic setup.py * fix some debian templates * checkpackage rewrite * provides checkers for the tester package 2003-08-29 -- 0.2.4 * added cvs_filecheck 2003-06-20 -- 0.2.2 * buildpackages fixes 2003-06-17 -- 0.2.1 * fix setup.py * make pkghandlers.export working with python <= 2.1 * add the mailinglist variable in __pkginfo__, used for announce generation in makedistrib 2003-06-16 -- 0.2.0 * minor enhancements * get package information for __pkginfo__.py ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/MyPyPa-0.1.0-py2.5.egg0000666000000000000000000000230614762603732020310 0ustar00rootrootPKAy:5\Bmypypa/__init__.pyc+Id(`b .) 
Q@!X $ $ 4@JD||YjQqf~^|H;(I9)I)%z9yffy%z@6)9v APKAy:\sjmypypa/__init__.py/K-*ϋWUP73TPKAy:K[EGG-INFO/SOURCES.txt+N-)-+ӏ,|+*Ru3u=1$C]J*J0RR RRR+s2󲋱**/I-KPKAy:#z| EGG-INFO/top_level.txt˭,,HPKAy:2EGG-INFO/dependency_links.txtPKAy:2EGG-INFO/zip-safePKAy:SA4~EGG-INFO/PKG-INFOM-ILI,I K-*ϳR03KMR HKss*RK22ҹ. """logilab.common packaging information""" from os.path import join __docformat__ = "restructuredtext en" import sys import os distname = "logilab-common" modname = "common" subpackage_of = "logilab" subpackage_master = True numversion = (0, 63, 2) version = ".".join([str(num) for num in numversion]) license = "LGPL" # 2.1 or later description = "collection of low-level Python packages and modules used by Logilab projects" web = f"http://www.logilab.org/project/{distname}" mailinglist = "mailto://python-projects@lists.logilab.org" author = "Logilab" author_email = "contact@logilab.fr" scripts = [join("bin", "logilab-pytest")] include_dirs = [join("test", "data")] install_requires = [] tests_require = ["pytz"] if sys.version_info < (2, 7): install_requires.append("unittest2 >= 0.5.1") if os.name == "nt": install_requires.append("colorama") classifiers = [ "Topic :: Utilities", "Programming Language :: Python", "Programming Language :: Python :: 2", "Programming Language :: Python :: 3", ] ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1741359094.9101827 logilab_common-2.1.0/test/data/content_differ_dir/0000755000000000000000000000000014762603767020772 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/content_differ_dir/NOTHING0000666000000000000000000000000014762603732022005 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/content_differ_dir/README0000666000000000000000000000001214762603732021637 0ustar00rootrootthank you ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.914183 logilab_common-2.1.0/test/data/content_differ_dir/subdir/0000755000000000000000000000000014762603767022262 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/content_differ_dir/subdir/coin0000666000000000000000000000000514762603732023124 0ustar00rootrootbaba ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/content_differ_dir/subdir/toto.txt0000666000000000000000000000612714762603732024012 0ustar00rootrootLorem ipsum dolor sit amet, consectetuer adipisci elit. Necesse qui quidem constituam tantis, et possunt placeat ipsum ex aut iucunde aut facta, aut impediente autem totum unum directam eius tum voluptate sensuum reperiuntur ad ab, quae ac.. Sed eius enim a, tranquillat ob vexetur permagna potius voluptate eo aliae, vivamus esse solis ut non, atomis videatur in ut, mihi litteris si ante vivere, deinde emancipaverat appetendum sine erant ex metu philosophiae fatemur, et magis non corpora ne, maluisti ita locupletiorem medicorum.. Tradere imperitos exiguam in sint saluti temeritate hoc, nullam nec quaerat, eademque vivendum, contra similique. Molestiae qui, tam sic ea honesto, graeca consecutionem voluptate inertissimae sunt, corpora denique fabulis dicere ab et quae ad politus tum in nostris.. Plane pueriliter, hoc affectus quid iis plus videtur dolorem vivere ad esse asperiores.. 
Quorum si nihilo eram conflixisse nec inpotenti, et bonum ad nostris servare omni, saepe multis, consequantur id, in fructuosam multi quod, voluptatem abducat a tantum sit error ipso si respirare corrupte referuntur, maiorem.. Voluptatem a etiam perspici gravissimas, cuius.. Unum morbis ne esse conscientia tamen conclusionemque notionem, amentur quam, praeclarorum eum consulatu iis invitat solum porro, quidem ad patria, fore res athenis sempiternum alii venire, est mei nam improbis dolorem, permulta timidiores. Et inquam sic familias, sequatur animis quae et quae ea esse, autem impediri quaeque modo inciderint consecutionem expectata, sed severa etiamsi, in egregios temporibus infinito ad artibus, voluptatem aristotele, tandem aliquo industriae collegi timiditatem sibi igitur aut, se cum tranquillitate loquuntur quod nullo, quam suum illustribus fugiendam illis tam consequatur.. Quas maximisque impendere ipsum se petat altera enim ocurreret sibi maxime, possit ea aegritudo aut ulla, et quod sed. Verissimum confirmat accurate totam iisque sequitur aut probabo et et adhibenda, mihi sed ad et quod erga minima rerum eius quod, tale et libidinosarum liber, omnis quae et nunc sicine, nec at aut omnem, sententiae a, repudiandae.. Vero esse crudelis amentur ut, atque facilius vita invitat, delectus excepturi ex libidinum non qua consequi beate quae ratio.. Illa poetis videor requirere, quippiam et autem ut et esset voluptate neque consilia sed voluptatibus est virtutum minima et, interesse exquirere et peccandi quae carere se, angere.. Firme nomine oratio perferendis si voluptates cogitavisse, feci maledici ea vis et, nam quae legantur animum animis temeritate, amicitiam desideraturam tollatur nisi de voluptatem. Ii videri accedit de.. Graeci tum factis ea ea itaque sunt latinis detractis reprehensiones nostrum sola non tantopere perfruique quoque fruenda aptissimum nostrum, pueros graeca qui eruditionem est quae, labore.. Omnia si quaerimus, si praetermissum vero deserunt quia democriti retinere ignoratione, iam de gerendarum vel a maxime provident, in eadem si praeterierunt, certa cibo ut utilitatibus nullo quod voluptatis iis eamque omnia, stare aut, quamquam et, ut illa susceperant legant consiliisque, est sed quantum igitur. 
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/deprecation.py0000666000000000000000000000011314762603732020001 0ustar00rootroot# placeholder used by unittest_deprecation def moving_target(): pass ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.914183 logilab_common-2.1.0/test/data/file_differ_dir/0000755000000000000000000000000014762603767020237 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/file_differ_dir/NOTHING0000666000000000000000000000000014762603732021252 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/file_differ_dir/README0000666000000000000000000000001214762603732021104 0ustar00rootrootthank you ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.914183 logilab_common-2.1.0/test/data/file_differ_dir/subdir/0000755000000000000000000000000014762603767021527 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/file_differ_dir/subdir/toto.txt0000666000000000000000000000612614762603732023256 0ustar00rootrootLorem ipsum dolor sit amet, consectetuer adipisci elit. Necesse qui quidem constituam tantis, et possunt placeat ipsum ex aut iucunde aut facta, aut impediente autem totum unum directam eius tum voluptate sensuum reperiuntur ad ab, quae ac.. Sed eius enim a, tranquillat ob vexetur permagna potius voluptate eo aliae, vivamus esse solis ut non, atomis videatur in ut, mihi litteris si ante vivere, deinde emancipaverat appetendum sine erant ex metu philosophiae fatemur, et magis non corpora ne, maluisti ita locupletiorem medicorum.. Tradere imperitos exiguam in sint saluti temeritate hoc, nullam nec quaerat, eademque vivendum, contra similique. Molestiae qui, tam sic ea honesto, graeca consecutionem voluptate inertissimae sunt, corpora denique fabulis dicere ab et quae ad politus tum in nostris.. Plane pueriliter, hoc affectus quid iis plus videtur dolorem vivere ad esse asperiores.. Quorum si nihilo eram pedalis pertinax ii minus, referta mediocrem iustitiam acutum quo rerum constringendos ex pondere lucilius essent neglexerit insequitur a tantum sit error ipso si respirare corrupte referuntur, maiorem.. Voluptatem a etiam perspici gravissimas, cuius.. Unum morbis ne esse conscientia tamen conclusionemque notionem, amentur quam, praeclarorum eum consulatu iis invitat solum porro, quidem ad patria, fore res athenis sempiternum alii venire, est mei nam improbis dolorem, permulta timidiores. Et inquam sic familias, sequatur animis quae et quae ea esse, autem impediri quaeque modo inciderint consecutionem expectata, sed severa etiamsi, in egregios temporibus infinito ad artibus, voluptatem aristotele, tandem aliquo industriae collegi timiditatem sibi igitur aut, se cum tranquillitate loquuntur quod nullo, quam suum illustribus fugiendam illis tam consequatur.. Quas maximisque impendere ipsum se petat altera enim ocurreret sibi maxime, possit ea aegritudo aut ulla, et quod sed. Verissimum confirmat accurate totam iisque sequitur aut probabo et et adhibenda, mihi sed ad et quod erga minima rerum eius quod, tale et libidinosarum liber, omnis quae et nunc sicine, nec at aut omnem, sententiae a, repudiandae.. 
Vero esse crudelis amentur ut, atque facilius vita invitat, delectus excepturi ex libidinum non qua consequi beate quae ratio.. Illa poetis videor requirere, quippiam et autem ut et esset voluptate neque consilia sed voluptatibus est virtutum minima et, interesse exquirere et peccandi quae carere se, angere.. Firme nomine oratio perferendis si voluptates cogitavisse, feci maledici ea vis et, nam quae legantur animum animis temeritate, amicitiam desideraturam tollatur nisi de voluptatem. Ii videri accedit de.. Graeci tum factis ea ea itaque sunt latinis detractis reprehensiones nostrum sola non tantopere perfruique quoque fruenda aptissimum nostrum, pueros graeca qui eruditionem est quae, labore.. Omnia si quaerimus, si praetermissum vero deserunt quia democriti retinere ignoratione, iam de gerendarum vel a maxime provident, in eadem si praeterierunt, certa cibo ut utilitatibus nullo quod voluptatis iis eamque omnia, stare aut, quamquam et, ut illa susceperant legant consiliisque, est sed quantum igitur. ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.914183 logilab_common-2.1.0/test/data/file_differ_dir/subdirtwo/0000755000000000000000000000000014762603767022261 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/file_differ_dir/subdirtwo/Hello0000666000000000000000000000000014762603732023231 0ustar00rootroot././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1741359094.9181828 logilab_common-2.1.0/test/data/find_test/0000755000000000000000000000000014762603767017122 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/find_test/__init__.py0000666000000000000000000000000014762603732021215 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/find_test/foo.txt0000666000000000000000000000000014762603732020430 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/find_test/module.py0000666000000000000000000000000014762603732020743 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/find_test/module2.py0000666000000000000000000000000014762603732021025 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/find_test/newlines.txt0000666000000000000000000000000014762603732021471 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/find_test/noendingnewline.py0000666000000000000000000000000014762603732022641 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/find_test/nonregr.py0000666000000000000000000000000014762603732021130 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/find_test/normal_file.txt0000666000000000000000000000000014762603732022134 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 
logilab_common-2.1.0/test/data/find_test/spam.txt0000666000000000000000000000000014762603732020605 0ustar00rootroot././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.922183 logilab_common-2.1.0/test/data/find_test/sub/0000755000000000000000000000000014762603767017713 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/find_test/sub/doc.txt0000666000000000000000000000000014762603732021203 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/find_test/sub/momo.py0000666000000000000000000000000014762603732021216 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/find_test/test.ini0000666000000000000000000000000014762603732020564 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/find_test/test1.msg0000666000000000000000000000000014762603732020654 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/find_test/test2.msg0000666000000000000000000000000014762603732020655 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/find_test/write_protected_file.txt0000666000000000000000000000000014762603732024047 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/foo.txt0000666000000000000000000000002114762603732016454 0ustar00rootroota b c d e f g h ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.922183 logilab_common-2.1.0/test/data/lmfp/0000755000000000000000000000000014762603767016101 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/lmfp/__init__.py0000666000000000000000000000006314762603732020205 0ustar00rootroot# force a "direct" python import from . 
import foo ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/lmfp/foo.py0000666000000000000000000000025314762603732017232 0ustar00rootrootimport sys if not getattr(sys, "bar", None): sys.just_once = [] # there used to be two numbers here because # of a load_module_from_path bug sys.just_once.append(42) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/module.py0000666000000000000000000000257114762603732017003 0ustar00rootroot"""test module for astng""" from logilab.common import modutils, Execute as spawn from logilab.common.astutils import * MY_DICT = {} def global_access(key, val): """function test""" local = 1 MY_DICT[key] = val for i in val: if i: del MY_DICT[i] continue else: break else: print("!!!") class YO: """hehe""" a = 1 def __init__(self): try: self.yo = 1 except ValueError as ex: pass except (NameError, TypeError): raise XXXError() except: raise # print('*****>',YO.__dict__) class YOUPI(YO): class_attr = None def __init__(self): self.member = None def method(self): """method test""" global MY_DICT try: MY_DICT = {} local = None autre = [a for a, b in MY_DICT if b] if b in autre: print("yo", end=" ") elif a in autre: print("hehe") global_access(local, val=autre) finally: return local def static_method(): """static method test""" assert MY_DICT, "???" static_method = staticmethod(static_method) def class_method(cls): """class method test""" exec(a, b) class_method = classmethod(class_method) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/module2.py0000666000000000000000000000252614762603732017065 0ustar00rootrootfrom data.module import YO, YOUPI import data class Specialization(YOUPI, YO): pass class Metaclass(type): pass class Interface: pass class MyIFace(Interface): pass class AnotherIFace(Interface): pass class MyException(Exception): pass class MyError(MyException): pass class AbstractClass: def to_override(self, whatever): raise NotImplementedError() def return_something(self, param): if param: return "toto" return class Concrete0: __implements__ = MyIFace class Concrete1: __implements__ = MyIFace, AnotherIFace class Concrete2: __implements__ = (MyIFace, AnotherIFace) class Concrete23(Concrete1): pass del YO.member del YO [SYN1, SYN2] = Concrete0, Concrete1 assert "1" b = 1 | 2 & 3 ^ 8 exec("c = 3") exec("c = 3", {}, {}) def raise_string(a=2, *args, **kwargs): raise "pas glop" raise Exception("yo") yield "coucou" a = b + 2 c = b * 2 c = b / 2 c = b // 2 c = b - 2 c = b % 2 c = b**2 c = b << 2 c = b >> 2 c = ~b c = not b d = [c] e = d[:] e = d[a:b:c] raise_string(*args, **kwargs) print >> stream, "bonjour" print >> stream, "salut", def make_class(any, base=data.module.YO, *args, **kwargs): """check base is correctly resolved to Concrete0""" class Aaaa(base): """dynamic class""" return Aaaa ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/newlines.txt0000666000000000000000000000003114762603732017516 0ustar00rootroot# mixed new lines 1 2 3 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/noendingnewline.py0000666000000000000000000000070214762603732020673 0ustar00rootrootimport unittest class TestCase(unittest.TestCase): def setUp(self): unittest.TestCase.setUp(self) def tearDown(self): 
unittest.TestCase.tearDown(self) def testIt(self): self.a = 10 self.xxx() def xxx(self): if False: print("a") if False: pass if False: print("rara") if __name__ == "__main__": print("test2") unittest.main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/nonregr.py0000666000000000000000000000044514762603732017166 0ustar00rootroottry: enumerate = enumerate except NameError: def enumerate(iterable): """emulates the python2.3 enumerate() function""" i = 0 for val in iterable: yield i, val i += 1 def toto(value): for k, v in value: print(v.get("yo")) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/normal_file.txt0000666000000000000000000000000014762603732020155 0ustar00rootroot././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.922183 logilab_common-2.1.0/test/data/reference_dir/0000755000000000000000000000000014762603767017737 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/reference_dir/NOTHING0000666000000000000000000000000014762603732020752 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/reference_dir/README0000666000000000000000000000001214762603732020604 0ustar00rootrootthank you ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.922183 logilab_common-2.1.0/test/data/reference_dir/subdir/0000755000000000000000000000000014762603767021227 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/reference_dir/subdir/coin0000666000000000000000000000000514762603732022071 0ustar00rootrootbaba ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/reference_dir/subdir/toto.txt0000666000000000000000000000612614762603732022756 0ustar00rootrootLorem ipsum dolor sit amet, consectetuer adipisci elit. Necesse qui quidem constituam tantis, et possunt placeat ipsum ex aut iucunde aut facta, aut impediente autem totum unum directam eius tum voluptate sensuum reperiuntur ad ab, quae ac.. Sed eius enim a, tranquillat ob vexetur permagna potius voluptate eo aliae, vivamus esse solis ut non, atomis videatur in ut, mihi litteris si ante vivere, deinde emancipaverat appetendum sine erant ex metu philosophiae fatemur, et magis non corpora ne, maluisti ita locupletiorem medicorum.. Tradere imperitos exiguam in sint saluti temeritate hoc, nullam nec quaerat, eademque vivendum, contra similique. Molestiae qui, tam sic ea honesto, graeca consecutionem voluptate inertissimae sunt, corpora denique fabulis dicere ab et quae ad politus tum in nostris.. Plane pueriliter, hoc affectus quid iis plus videtur dolorem vivere ad esse asperiores.. Quorum si nihilo eram pedalis pertinax ii minus, referta mediocrem iustitiam acutum quo rerum constringendos ex pondere lucilius essent neglexerit insequitur a tantum sit error ipso si respirare corrupte referuntur, maiorem.. Voluptatem a etiam perspici gravissimas, cuius.. Unum morbis ne esse conscientia tamen conclusionemque notionem, amentur quam, praeclarorum eum consulatu iis invitat solum porro, quidem ad patria, fore res athenis sempiternum alii venire, est mei nam improbis dolorem, permulta timidiores. 
Et inquam sic familias, sequatur animis quae et quae ea esse, autem impediri quaeque modo inciderint consecutionem expectata, sed severa etiamsi, in egregios temporibus infinito ad artibus, voluptatem aristotele, tandem aliquo industriae collegi timiditatem sibi igitur aut, se cum tranquillitate loquuntur quod nullo, quam suum illustribus fugiendam illis tam consequatur.. Quas maximisque impendere ipsum se petat altera enim ocurreret sibi maxime, possit ea aegritudo aut ulla, et quod sed. Verissimum confirmat accurate totam iisque sequitur aut probabo et et adhibenda, mihi sed ad et quod erga minima rerum eius quod, tale et libidinosarum liber, omnis quae et nunc sicine, nec at aut omnem, sententiae a, repudiandae.. Vero esse crudelis amentur ut, atque facilius vita invitat, delectus excepturi ex libidinum non qua consequi beate quae ratio.. Illa poetis videor requirere, quippiam et autem ut et esset voluptate neque consilia sed voluptatibus est virtutum minima et, interesse exquirere et peccandi quae carere se, angere.. Firme nomine oratio perferendis si voluptates cogitavisse, feci maledici ea vis et, nam quae legantur animum animis temeritate, amicitiam desideraturam tollatur nisi de voluptatem. Ii videri accedit de.. Graeci tum factis ea ea itaque sunt latinis detractis reprehensiones nostrum sola non tantopere perfruique quoque fruenda aptissimum nostrum, pueros graeca qui eruditionem est quae, labore.. Omnia si quaerimus, si praetermissum vero deserunt quia democriti retinere ignoratione, iam de gerendarum vel a maxime provident, in eadem si praeterierunt, certa cibo ut utilitatibus nullo quod voluptatis iis eamque omnia, stare aut, quamquam et, ut illa susceperant legant consiliisque, est sed quantum igitur. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/regobjects.py0000666000000000000000000000122514762603732017640 0ustar00rootroot"""unittest_registry data file""" from logilab.common.registry import yes, RegistrableObject, RegistrableInstance class Proxy: """annoying object should that not be registered, nor cause error""" def __getattr__(self, attr): return 1 trap = Proxy() class AppObjectClass(RegistrableObject): __registry__ = "zereg" __regid__ = "appobject1" __module__ = "regobjects" __select__ = yes() class AppObjectInstance(RegistrableInstance): __registry__ = "zereg" __module__ = "regobjects" __select__ = yes() def __init__(self, regid): self.__regid__ = regid appobject2 = AppObjectInstance("appobject2") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/regobjects2.py0000666000000000000000000000041314762603732017720 0ustar00rootrootfrom logilab.common.registry import RegistrableObject, RegistrableInstance, yes class MyRegistrableInstance(RegistrableInstance): __regid__ = "appobject3" __select__ = yes() __registry__ = "zereg" instance = MyRegistrableInstance(__module__=__name__) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.922183 logilab_common-2.1.0/test/data/same_dir/0000755000000000000000000000000014762603767016726 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/same_dir/NOTHING0000666000000000000000000000000014762603732017741 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 
logilab_common-2.1.0/test/data/same_dir/README0000666000000000000000000000001214762603732017573 0ustar00rootrootthank you ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.926183 logilab_common-2.1.0/test/data/same_dir/subdir/0000755000000000000000000000000014762603767020216 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/same_dir/subdir/coin0000666000000000000000000000000514762603732021060 0ustar00rootrootbaba ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/same_dir/subdir/toto.txt0000666000000000000000000000612614762603732021745 0ustar00rootrootLorem ipsum dolor sit amet, consectetuer adipisci elit. Necesse qui quidem constituam tantis, et possunt placeat ipsum ex aut iucunde aut facta, aut impediente autem totum unum directam eius tum voluptate sensuum reperiuntur ad ab, quae ac.. Sed eius enim a, tranquillat ob vexetur permagna potius voluptate eo aliae, vivamus esse solis ut non, atomis videatur in ut, mihi litteris si ante vivere, deinde emancipaverat appetendum sine erant ex metu philosophiae fatemur, et magis non corpora ne, maluisti ita locupletiorem medicorum.. Tradere imperitos exiguam in sint saluti temeritate hoc, nullam nec quaerat, eademque vivendum, contra similique. Molestiae qui, tam sic ea honesto, graeca consecutionem voluptate inertissimae sunt, corpora denique fabulis dicere ab et quae ad politus tum in nostris.. Plane pueriliter, hoc affectus quid iis plus videtur dolorem vivere ad esse asperiores.. Quorum si nihilo eram pedalis pertinax ii minus, referta mediocrem iustitiam acutum quo rerum constringendos ex pondere lucilius essent neglexerit insequitur a tantum sit error ipso si respirare corrupte referuntur, maiorem.. Voluptatem a etiam perspici gravissimas, cuius.. Unum morbis ne esse conscientia tamen conclusionemque notionem, amentur quam, praeclarorum eum consulatu iis invitat solum porro, quidem ad patria, fore res athenis sempiternum alii venire, est mei nam improbis dolorem, permulta timidiores. Et inquam sic familias, sequatur animis quae et quae ea esse, autem impediri quaeque modo inciderint consecutionem expectata, sed severa etiamsi, in egregios temporibus infinito ad artibus, voluptatem aristotele, tandem aliquo industriae collegi timiditatem sibi igitur aut, se cum tranquillitate loquuntur quod nullo, quam suum illustribus fugiendam illis tam consequatur.. Quas maximisque impendere ipsum se petat altera enim ocurreret sibi maxime, possit ea aegritudo aut ulla, et quod sed. Verissimum confirmat accurate totam iisque sequitur aut probabo et et adhibenda, mihi sed ad et quod erga minima rerum eius quod, tale et libidinosarum liber, omnis quae et nunc sicine, nec at aut omnem, sententiae a, repudiandae.. Vero esse crudelis amentur ut, atque facilius vita invitat, delectus excepturi ex libidinum non qua consequi beate quae ratio.. Illa poetis videor requirere, quippiam et autem ut et esset voluptate neque consilia sed voluptatibus est virtutum minima et, interesse exquirere et peccandi quae carere se, angere.. Firme nomine oratio perferendis si voluptates cogitavisse, feci maledici ea vis et, nam quae legantur animum animis temeritate, amicitiam desideraturam tollatur nisi de voluptatem. Ii videri accedit de.. 
Graeci tum factis ea ea itaque sunt latinis detractis reprehensiones nostrum sola non tantopere perfruique quoque fruenda aptissimum nostrum, pueros graeca qui eruditionem est quae, labore.. Omnia si quaerimus, si praetermissum vero deserunt quia democriti retinere ignoratione, iam de gerendarum vel a maxime provident, in eadem si praeterierunt, certa cibo ut utilitatibus nullo quod voluptatis iis eamque omnia, stare aut, quamquam et, ut illa susceperant legant consiliisque, est sed quantum igitur. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/spam.txt0000666000000000000000000000002114762603732016631 0ustar00rootroota b c h e f g h ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.926183 logilab_common-2.1.0/test/data/sub/0000755000000000000000000000000014762603767015734 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/sub/doc.txt0000666000000000000000000000000714762603732017233 0ustar00rootroothhh ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/sub/momo.py0000666000000000000000000000001414762603732017244 0ustar00rootrootprint("yo") ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.926183 logilab_common-2.1.0/test/data/subdir_differ_dir/0000755000000000000000000000000014762603767020610 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/subdir_differ_dir/NOTHING0000666000000000000000000000000014762603732021623 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/subdir_differ_dir/README0000666000000000000000000000001214762603732021455 0ustar00rootrootthank you ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1741359094.926183 logilab_common-2.1.0/test/data/subdir_differ_dir/subdir/0000755000000000000000000000000014762603767022100 5ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/subdir_differ_dir/subdir/coin0000666000000000000000000000000514762603732022742 0ustar00rootrootbaba ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/subdir_differ_dir/subdir/toto.txt0000666000000000000000000000612614762603732023627 0ustar00rootrootLorem ipsum dolor sit amet, consectetuer adipisci elit. Necesse qui quidem constituam tantis, et possunt placeat ipsum ex aut iucunde aut facta, aut impediente autem totum unum directam eius tum voluptate sensuum reperiuntur ad ab, quae ac.. Sed eius enim a, tranquillat ob vexetur permagna potius voluptate eo aliae, vivamus esse solis ut non, atomis videatur in ut, mihi litteris si ante vivere, deinde emancipaverat appetendum sine erant ex metu philosophiae fatemur, et magis non corpora ne, maluisti ita locupletiorem medicorum.. Tradere imperitos exiguam in sint saluti temeritate hoc, nullam nec quaerat, eademque vivendum, contra similique. Molestiae qui, tam sic ea honesto, graeca consecutionem voluptate inertissimae sunt, corpora denique fabulis dicere ab et quae ad politus tum in nostris.. 
Plane pueriliter, hoc affectus quid iis plus videtur dolorem vivere ad esse asperiores.. Quorum si nihilo eram pedalis pertinax ii minus, referta mediocrem iustitiam acutum quo rerum constringendos ex pondere lucilius essent neglexerit insequitur a tantum sit error ipso si respirare corrupte referuntur, maiorem.. Voluptatem a etiam perspici gravissimas, cuius.. Unum morbis ne esse conscientia tamen conclusionemque notionem, amentur quam, praeclarorum eum consulatu iis invitat solum porro, quidem ad patria, fore res athenis sempiternum alii venire, est mei nam improbis dolorem, permulta timidiores. Et inquam sic familias, sequatur animis quae et quae ea esse, autem impediri quaeque modo inciderint consecutionem expectata, sed severa etiamsi, in egregios temporibus infinito ad artibus, voluptatem aristotele, tandem aliquo industriae collegi timiditatem sibi igitur aut, se cum tranquillitate loquuntur quod nullo, quam suum illustribus fugiendam illis tam consequatur.. Quas maximisque impendere ipsum se petat altera enim ocurreret sibi maxime, possit ea aegritudo aut ulla, et quod sed. Verissimum confirmat accurate totam iisque sequitur aut probabo et et adhibenda, mihi sed ad et quod erga minima rerum eius quod, tale et libidinosarum liber, omnis quae et nunc sicine, nec at aut omnem, sententiae a, repudiandae.. Vero esse crudelis amentur ut, atque facilius vita invitat, delectus excepturi ex libidinum non qua consequi beate quae ratio.. Illa poetis videor requirere, quippiam et autem ut et esset voluptate neque consilia sed voluptatibus est virtutum minima et, interesse exquirere et peccandi quae carere se, angere.. Firme nomine oratio perferendis si voluptates cogitavisse, feci maledici ea vis et, nam quae legantur animum animis temeritate, amicitiam desideraturam tollatur nisi de voluptatem. Ii videri accedit de.. Graeci tum factis ea ea itaque sunt latinis detractis reprehensiones nostrum sola non tantopere perfruique quoque fruenda aptissimum nostrum, pueros graeca qui eruditionem est quae, labore.. Omnia si quaerimus, si praetermissum vero deserunt quia democriti retinere ignoratione, iam de gerendarum vel a maxime provident, in eadem si praeterierunt, certa cibo ut utilitatibus nullo quod voluptatis iis eamque omnia, stare aut, quamquam et, ut illa susceperant legant consiliisque, est sed quantum igitur. 
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/test.ini0000666000000000000000000000026214762603732016617 0ustar00rootroot# test configuration [TEST] dothis=yes value=' ' # you can also document the option multiple=yop number=2 #choice renamed=yo multiple-choice=yo,ye [OLD] named=key:val ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/test1.msg0000666000000000000000000000164014762603732016710 0ustar00rootrootFrom Nicolas.Chauvat@logilab.fr Wed Jul 20 12:03:06 2005 Return-Path: X-Original-To: nico@logilab.fr Delivered-To: nico@logilab.fr Received: from logilab.fr (crater.logilab.fr [172.17.1.4]) by orion.logilab.fr (Postfix) with SMTP id 7D3412BDA6 for ; Wed, 20 Jul 2005 12:03:06 +0200 (CEST) Received: (nullmailer pid 8382 invoked by uid 1000); Wed, 20 Jul 2005 10:03:20 -0000 Date: Wed, 20 Jul 2005 12:03:20 +0200 From: Nicolas Chauvat To: Nicolas Chauvat Subject: autre message Message-ID: <20050720100320.GA8371@logilab.fr> Mime-Version: 1.0 Content-Type: text/plain; charset=utf-8 Content-Disposition: inline Content-Transfer-Encoding: 8bit User-Agent: Mutt/1.5.9i X-Spambayes-Classification: ham; 0.01 Content-Length: 106 Lines: 6 bonjour -- Nicolas Chauvat logilab.fr - services en informatique avancée et gestion de connaissances ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/test2.msg0000666000000000000000000000235014762603732016710 0ustar00rootrootFrom alexandre.fayolle@logilab.fr Wed Jul 27 11:21:57 2005 Date: Wed, 27 Jul 2005 11:21:57 +0200 From: Alexandre =?iso-8859-1?Q?'d=E9couvreur?= de bugs' Fayolle To: =?iso-8859-1?B?6WzpbWVudCDg?= accents Subject: =?iso-8859-1?Q?=C0?= LA MER Message-ID: <20050727092157.GB3923@logilab.fr> Mime-Version: 1.0 Content-Type: multipart/signed; micalg=pgp-sha1; protocol="application/pgp-signature"; boundary="wULyF7TL5taEdwHz" Content-Disposition: inline User-Agent: Mutt/1.5.9i Status: RO Content-Length: 692 Lines: 26 --wULyF7TL5taEdwHz Content-Type: text/plain; charset=iso-8859-1 Content-Disposition: inline Content-Transfer-Encoding: quoted-printable il s'est pass=E9 de dr=F4les de choses.=20 --=20 Alexandre Fayolle LOGILAB, Paris (France). http://www.logilab.com http://www.logilab.fr http://www.logilab.org --wULyF7TL5taEdwHz Content-Type: application/pgp-signature; name="signature.asc" Content-Description: Digital signature Content-Disposition: inline -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.4.1 (GNU/Linux) iD8DBQFC51I1Ll/b4N9npV4RAsaLAJ4k9C8Hnrjg+Q3ocrUYnYppTVcgyQCeO8yT B7AM5XzlRD1lYqlxq+h80K8= =zfVV -----END PGP SIGNATURE----- --wULyF7TL5taEdwHz-- ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/data/write_protected_file.txt0000666000000000000000000000000014762603732022070 0ustar00rootroot././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_cache.py0000666000000000000000000001243614762603732016710 0ustar00rootroot# unit tests for the cache module # copyright 2003-2012 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. 
# # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . from logilab.common.testlib import TestCase, unittest_main from logilab.common.cache import Cache class CacheTestCase(TestCase): def setUp(self): self.cache = Cache(5) self.testdict = {} def test_setitem1(self): """Checks that the setitem method works""" self.cache[1] = "foo" self.assertEqual(self.cache[1], "foo", "1:foo is not in cache") self.assertEqual(len(self.cache._usage), 1) self.assertEqual(self.cache._usage[-1], 1, "1 is not the most recently used key") self.assertCountEqual( self.cache._usage, self.cache.keys(), "usage list and data keys are different" ) def test_setitem2(self): """Checks that the setitem method works for multiple items""" self.cache[1] = "foo" self.cache[2] = "bar" self.assertEqual(self.cache[2], "bar", "2 : 'bar' is not in cache.data") self.assertEqual(len(self.cache._usage), 2, "length of usage list is not 2") self.assertEqual(self.cache._usage[-1], 2, "2 is not the most recently used key") self.assertCountEqual( self.cache._usage, self.cache.keys() ) # usage list and data keys are different def test_setitem3(self): """Checks that the setitem method works when replacing an element in the cache""" self.cache[1] = "foo" self.cache[1] = "bar" self.assertEqual(self.cache[1], "bar", "1 : 'bar' is not in cache.data") self.assertEqual(len(self.cache._usage), 1, "length of usage list is not 1") self.assertEqual(self.cache._usage[-1], 1, "1 is not the most recently used key") self.assertCountEqual( self.cache._usage, self.cache.keys() ) # usage list and data keys are different def test_recycling1(self): """Checks the removal of old elements""" self.cache[1] = "foo" self.cache[2] = "bar" self.cache[3] = "baz" self.cache[4] = "foz" self.cache[5] = "fuz" self.cache[6] = "spam" self.assertNotIn(1, self.cache, "key 1 has not been suppressed from the cache dictionary") self.assertNotIn( 1, self.cache._usage, "key 1 has not been suppressed from the cache LRU list" ) self.assertEqual(len(self.cache._usage), 5, "length of usage list is not 5") self.assertEqual(self.cache._usage[-1], 6, "6 is not the most recently used key") self.assertCountEqual( self.cache._usage, self.cache.keys() ) # usage list and data keys are different def test_recycling2(self): """Checks that accessed elements get in the front of the list""" self.cache[1] = "foo" self.cache[2] = "bar" self.cache[3] = "baz" self.cache[4] = "foz" a = self.cache[1] self.assertEqual(a, "foo") self.assertEqual(self.cache._usage[-1], 1, "1 is not the most recently used key") self.assertCountEqual( self.cache._usage, self.cache.keys() ) # usage list and data keys are different def test_delitem(self): """Checks that elements are removed from both element dict and element list. 
""" self.cache["foo"] = "bar" del self.cache["foo"] self.assertNotIn( "foo", self.cache.keys(), "Element 'foo' was not removed cache dictionnary" ) self.assertNotIn("foo", self.cache._usage, "Element 'foo' was not removed usage list") self.assertCountEqual( self.cache._usage, self.cache.keys() ) # usage list and data keys are different def test_nullsize(self): """Checks that a 'NULL' size cache doesn't store anything""" null_cache = Cache(0) null_cache["foo"] = "bar" self.assertEqual(null_cache.size, 0, "Cache size should be O, not %d" % null_cache.size) self.assertEqual(len(null_cache), 0, "Cache should be empty !") # Assert null_cache['foo'] raises a KeyError self.assertRaises(KeyError, null_cache.__getitem__, "foo") # Deleting element raises a KeyError self.assertRaises(KeyError, null_cache.__delitem__, "foo") def test_getitem(self): """Checks that getitem doest not modify the _usage attribute""" try: self.cache["toto"] except KeyError: self.assertNotIn("toto", self.cache._usage) else: self.fail("excepted KeyError") if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_changelog.py0000666000000000000000000000257014762603732017572 0ustar00rootroot# copyright 2003-2016 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) # any later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License # along with logilab-common. If not, see . from os.path import join, dirname from io import StringIO from logilab.common.testlib import TestCase, unittest_main from logilab.common.changelog import ChangeLog class ChangeLogTC(TestCase): cl_class = ChangeLog cl_file = join(dirname(__file__), "data", "ChangeLog") def test_round_trip(self): cl = self.cl_class(self.cl_file) out = StringIO() cl.write(out) with open(self.cl_file) as stream: self.assertMultiLineEqual(stream.read(), out.getvalue()) if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_configuration.py0000666000000000000000000003724014762603732020514 0ustar00rootroot# copyright 2003-2014 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. 
# # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . import tempfile import os from os.path import join, dirname, abspath import re from io import StringIO from sys import version_info from logilab.common import attrdict from logilab.common.testlib import TestCase, unittest_main from logilab.common.optik_ext import OptionValueError from logilab.common.configuration import ( Configuration, OptionError, OptionsManagerMixIn, OptionsProviderMixIn, Method, read_old_config, merge_options, ) DATA = join(dirname(abspath(__file__)), "data") OPTIONS = [ ("dothis", {"type": "yn", "action": "store", "default": True, "metavar": ""}), ("value", {"type": "string", "metavar": "", "short": "v"}), ( "multiple", { "type": "csv", "default": ["yop", "yep"], "metavar": "", "help": "you can also document the option", }, ), ("number", {"type": "int", "default": 2, "metavar": "", "help": "boom"}), ("bytes", {"type": "bytes", "default": "1KB", "metavar": ""}), ("choice", {"type": "choice", "default": "yo", "choices": ("yo", "ye"), "metavar": ""}), ( "multiple-choice", { "type": "multiple_choice", "default": ["yo", "ye"], "choices": ("yo", "ye", "yu", "yi", "ya"), "metavar": "", }, ), ("named", {"type": "named", "default": Method("get_named"), "metavar": ""}), ( "diffgroup", {"type": "string", "default": "pouet", "metavar": "", "group": "agroup"}, ), ("reset-value", {"type": "string", "metavar": "", "short": "r", "dest": "value"}), ("opt-b-1", {"type": "string", "metavar": "", "group": "bgroup"}), ("opt-b-2", {"type": "string", "metavar": "", "group": "bgroup"}), ] class MyConfiguration(Configuration): """test configuration""" def get_named(self): return {"key": "val"} class ConfigurationTC(TestCase): def setUp(self): self.cfg = MyConfiguration(name="test", options=OPTIONS, usage="Just do it ! 
(tm)") def test_default(self): cfg = self.cfg self.assertEqual(cfg["dothis"], True) self.assertEqual(cfg["value"], None) self.assertEqual(cfg["multiple"], ["yop", "yep"]) self.assertEqual(cfg["number"], 2) self.assertEqual(cfg["bytes"], 1024) self.assertIsInstance(cfg["bytes"], int) self.assertEqual(cfg["choice"], "yo") self.assertEqual(cfg["multiple-choice"], ["yo", "ye"]) self.assertEqual(cfg["named"], {"key": "val"}) def test_base(self): cfg = self.cfg cfg.set_option("number", "0") self.assertEqual(cfg["number"], 0) self.assertRaises(OptionValueError, cfg.set_option, "number", "youpi") self.assertRaises(OptionValueError, cfg.set_option, "choice", "youpi") self.assertRaises(OptionValueError, cfg.set_option, "multiple-choice", ("yo", "y", "ya")) cfg.set_option("multiple-choice", "yo, ya") self.assertEqual(cfg["multiple-choice"], ["yo", "ya"]) self.assertEqual(cfg.get("multiple-choice"), ["yo", "ya"]) self.assertEqual(cfg.get("whatever"), None) def test_load_command_line_configuration(self): cfg = self.cfg args = cfg.load_command_line_configuration( [ "--choice", "ye", "--number", "4", "--multiple=1,2,3", "--dothis=n", "--bytes=10KB", "other", "arguments", ] ) self.assertEqual(args, ["other", "arguments"]) self.assertEqual(cfg["dothis"], False) self.assertEqual(cfg["multiple"], ["1", "2", "3"]) self.assertEqual(cfg["number"], 4) self.assertEqual(cfg["bytes"], 10240) self.assertEqual(cfg["choice"], "ye") self.assertEqual(cfg["value"], None) args = cfg.load_command_line_configuration(["-v", "duh"]) self.assertEqual(args, []) self.assertEqual(cfg["value"], "duh") self.assertEqual(cfg["dothis"], False) self.assertEqual(cfg["multiple"], ["1", "2", "3"]) self.assertEqual(cfg["number"], 4) self.assertEqual(cfg["bytes"], 10240) self.assertEqual(cfg["choice"], "ye") def test_load_configuration(self): cfg = self.cfg cfg.load_configuration( choice="ye", number="4", multiple="1,2,3", dothis="n", multiple_choice=("yo", "ya") ) self.assertEqual(cfg["dothis"], False) self.assertEqual(cfg["multiple"], ["1", "2", "3"]) self.assertEqual(cfg["number"], 4) self.assertEqual(cfg["choice"], "ye") self.assertEqual(cfg["value"], None) self.assertEqual(cfg["multiple-choice"], ("yo", "ya")) def test_load_configuration_file_case_insensitive(self): file = tempfile.mktemp() stream = open(file, "w") try: stream.write( """[Test] dothis=no #value= # you can also document the option multiple=yop,yepii # boom number=3 bytes=1KB choice=yo multiple-choice=yo,ye named=key:val [agroup] diffgroup=zou """ ) stream.close() self.cfg.load_file_configuration(file) self.assertEqual(self.cfg["dothis"], False) self.assertEqual(self.cfg["value"], None) self.assertEqual(self.cfg["multiple"], ["yop", "yepii"]) self.assertEqual(self.cfg["diffgroup"], "zou") finally: os.remove(file) def test_option_order(self): """Check that options are taken into account in the command line order and not in the order they are defined in the Configuration object. 
""" file = tempfile.mktemp() stream = open(file, "w") try: stream.write( """[Test] reset-value=toto value=tata """ ) stream.close() self.cfg.load_file_configuration(file) finally: os.remove(file) self.assertEqual(self.cfg["value"], "tata") def test_unsupported_options(self): file = tempfile.mktemp() stream = open(file, "w") try: stream.write( """[Test] whatever=toto value=tata """ ) stream.close() self.cfg.load_file_configuration(file) finally: os.remove(file) self.assertEqual(self.cfg["value"], "tata") self.assertRaises(OptionError, self.cfg.__getitem__, "whatever") def test_generate_config(self): stream = StringIO() self.cfg.generate_config(stream) self.assertMultiLineEqual( stream.getvalue().strip(), """[TEST] dothis=yes #value= # you can also document the option multiple=yop,yep # boom number=2 bytes=1KB choice=yo multiple-choice=yo,ye named=key:val #reset-value= [AGROUP] diffgroup=pouet [BGROUP] #opt-b-1= #opt-b-2=""", ) def test_generate_config_header(self): header_message = """This a multiline header All lines should be commented. Everything should be at the top of the file.""" commented_header_message = """# This a multiline header # All lines should be commented. # Everything should be at the top of the file.""" exepected_stream = StringIO() print(commented_header_message, file=exepected_stream) self.cfg.generate_config(exepected_stream) stream = StringIO() self.cfg.generate_config(stream, header_message=header_message) self.assertMultiLineEqual(stream.getvalue().strip(), exepected_stream.getvalue().strip()) def test_generate_config_with_space_string(self): self.cfg["value"] = " " stream = StringIO() self.cfg.generate_config(stream) self.assertMultiLineEqual( stream.getvalue().strip(), """[TEST] dothis=yes value=' ' # you can also document the option multiple=yop,yep # boom number=2 bytes=1KB choice=yo multiple-choice=yo,ye named=key:val reset-value=' ' [AGROUP] diffgroup=pouet [BGROUP] #opt-b-1= #opt-b-2=""", ) def test_generate_config_with_multiline_string(self): self.cfg["value"] = "line1\nline2\nline3" stream = StringIO() self.cfg.generate_config(stream) self.assertMultiLineEqual( stream.getvalue().strip(), """[TEST] dothis=yes value= line1 line2 line3 # you can also document the option multiple=yop,yep # boom number=2 bytes=1KB choice=yo multiple-choice=yo,ye named=key:val reset-value= line1 line2 line3 [AGROUP] diffgroup=pouet [BGROUP] #opt-b-1= #opt-b-2=""", ) def test_roundtrip(self): cfg = self.cfg f = tempfile.mktemp() stream = open(f, "w") try: self.cfg["dothis"] = False self.cfg["multiple"] = ["toto", "tata"] self.cfg["number"] = 3 self.cfg["bytes"] = 2048 cfg.generate_config(stream) stream.close() new_cfg = MyConfiguration(name="test", options=OPTIONS) new_cfg.load_file_configuration(f) self.assertEqual(cfg["dothis"], new_cfg["dothis"]) self.assertEqual(cfg["multiple"], new_cfg["multiple"]) self.assertEqual(cfg["number"], new_cfg["number"]) self.assertEqual(cfg["bytes"], new_cfg["bytes"]) self.assertEqual(cfg["choice"], new_cfg["choice"]) self.assertEqual(cfg["value"], new_cfg["value"]) self.assertEqual(cfg["multiple-choice"], new_cfg["multiple-choice"]) finally: os.remove(f) def test_setitem(self): self.assertRaises(OptionValueError, self.cfg.__setitem__, "multiple-choice", ("a", "b")) self.cfg["multiple-choice"] = ("yi", "ya") self.assertEqual(self.cfg["multiple-choice"], ("yi", "ya")) def test_help(self): self.cfg.add_help_section("bonus", "a nice additional help") help = self.cfg.help().strip() # at least in python 2.4.2 the output is: # ' -v , --value=' # it is 
not unlikely some optik/optparse versions do print -v # so accept both help = help.replace(" -v , ", " -v, ") help = re.sub("[ ]*(\r?\n)", "\\1", help) USAGE = """Usage: Just do it ! (tm) Options: -h, --help show this help message and exit --dothis= -v, --value= --multiple= you can also document the option [current: yop,yep] --number= boom [current: 2] --bytes= --choice= --multiple-choice= --named= -r , --reset-value= Agroup: --diffgroup= Bgroup: --opt-b-1= --opt-b-2= Bonus: a nice additional help""" if version_info < (2, 5): # 'usage' header is not capitalized in this version USAGE = USAGE.replace("Usage: ", "usage: ") elif version_info < (2, 4): USAGE = """usage: Just do it ! (tm) options: -h, --help show this help message and exit --dothis= -v, --value= --multiple= you can also document the option --number= --choice= --multiple-choice= --named= Bonus: a nice additional help """ self.assertMultiLineEqual(help, USAGE) def test_manpage(self): pkginfo = {} with open(join(DATA, "__pkginfo__.py")) as fobj: exec(fobj.read(), pkginfo) self.cfg.generate_manpage(attrdict(pkginfo), stream=StringIO()) def test_rewrite_config(self): changes = [ ("renamed", "renamed", "choice"), ("moved", "named", "old", "test"), ] read_old_config(self.cfg, changes, join(DATA, "test.ini")) stream = StringIO() self.cfg.generate_config(stream) self.assertMultiLineEqual( stream.getvalue().strip(), """[TEST] dothis=yes value=' ' # you can also document the option multiple=yop # boom number=2 bytes=1KB choice=yo multiple-choice=yo,ye named=key:val reset-value=' ' [AGROUP] diffgroup=pouet [BGROUP] #opt-b-1= #opt-b-2=""", ) class Linter(OptionsManagerMixIn, OptionsProviderMixIn): options = ( ( "profile", {"type": "yn", "metavar": "", "default": False, "help": "Profiled execution."}, ), ) def __init__(self): OptionsManagerMixIn.__init__(self, usage="") OptionsProviderMixIn.__init__(self) self.register_options_provider(self) self.load_provider_defaults() class RegrTC(TestCase): def setUp(self): self.linter = Linter() def test_load_defaults(self): self.linter.load_command_line_configuration([]) self.assertEqual(self.linter.config.profile, False) def test_register_options_multiple_groups(self): """ensure multiple option groups can be registered at once""" config = Configuration() self.assertEqual(config.options, ()) new_options = ( ("option1", {"type": "string", "help": "", "group": "g1", "level": 2}), ("option2", {"type": "string", "help": "", "group": "g1", "level": 2}), ("option3", {"type": "string", "help": "", "group": "g2", "level": 2}), ) config.register_options(new_options) self.assertEqual(config.options, new_options) class MergeTC(TestCase): def test_merge1(self): merged = merge_options( [ ( "dothis", {"type": "yn", "action": "store", "default": True, "metavar": ""}, ), ( "dothis", {"type": "yn", "action": "store", "default": False, "metavar": ""}, ), ] ) self.assertEqual(len(merged), 1) self.assertEqual(merged[0][0], "dothis") self.assertEqual(merged[0][1]["default"], True) def test_merge2(self): merged = merge_options( [ ( "dothis", {"type": "yn", "action": "store", "default": True, "metavar": ""}, ), ("value", {"type": "string", "metavar": "", "short": "v"}), ( "dothis", {"type": "yn", "action": "store", "default": False, "metavar": ""}, ), ] ) self.assertEqual(len(merged), 2) self.assertEqual(merged[0][0], "value") self.assertEqual(merged[1][0], "dothis") self.assertEqual(merged[1][1]["default"], True) if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 
xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_date.py0000666000000000000000000002063314762603732016560 0ustar00rootroot# copyright 2003-2012 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """ Unittests for date helpers """ from logilab.common.testlib import TestCase, unittest_main, tag from logilab.common.date import ( date_range, endOfMonth, add_days_worked, nb_open_days, get_national_holidays, ustrftime, ticks2datetime, utcdatetime, datetime2ticks, ) from datetime import date, datetime, timedelta from calendar import timegm import pytz try: from mx.DateTime import ( Date as mxDate, DateTime as mxDateTime, now as mxNow, RelativeDateTime, ) except ImportError: mxDate = mxDateTime = RelativeDateTime = mxNow = None class DateTC(TestCase): datecls = date datetimecls = datetime timedeltacls = timedelta now = datetime.now def test_day(self): """enumerate days""" r = list(date_range(self.datecls(2000, 1, 1), self.datecls(2000, 1, 4))) expected = [self.datecls(2000, 1, 1), self.datecls(2000, 1, 2), self.datecls(2000, 1, 3)] self.assertListEqual(r, expected) r = list(date_range(self.datecls(2000, 1, 31), self.datecls(2000, 2, 3))) expected = [self.datecls(2000, 1, 31), self.datecls(2000, 2, 1), self.datecls(2000, 2, 2)] self.assertListEqual(r, expected) r = list(date_range(self.datecls(2000, 1, 1), self.datecls(2000, 1, 6), 2)) expected = [self.datecls(2000, 1, 1), self.datecls(2000, 1, 3), self.datecls(2000, 1, 5)] self.assertListEqual(r, expected) def test_add_days_worked(self): add = add_days_worked # normal self.assertEqual(add(self.datecls(2008, 1, 3), 1), self.datecls(2008, 1, 4)) # skip week-end self.assertEqual(add(self.datecls(2008, 1, 3), 2), self.datecls(2008, 1, 7)) # skip 2 week-ends self.assertEqual(add(self.datecls(2008, 1, 3), 8), self.datecls(2008, 1, 15)) # skip holiday + week-end self.assertEqual(add(self.datecls(2008, 4, 30), 2), self.datecls(2008, 5, 5)) def test_get_national_holidays(self): holidays = get_national_holidays self.assertEqual( holidays(self.datecls(2008, 4, 29), self.datecls(2008, 5, 2)), [self.datecls(2008, 5, 1)], ) self.assertEqual(holidays(self.datecls(2008, 5, 7), self.datecls(2008, 5, 8)), []) x = self.datetimecls(2008, 5, 7, 12, 12, 12) self.assertEqual(holidays(x, x + self.timedeltacls(days=1)), []) def test_open_days_now_and_before(self): nb = nb_open_days x = self.now() y = x - self.timedeltacls(seconds=1) self.assertRaises(AssertionError, nb, x, y) def assertOpenDays(self, start, stop, expected): got = nb_open_days(start, stop) self.assertEqual(got, expected) def test_open_days_tuesday_friday(self): self.assertOpenDays(self.datecls(2008, 3, 4), self.datecls(2008, 3, 7), 3) def test_open_days_day_nextday(self): self.assertOpenDays(self.datecls(2008, 3, 4), self.datecls(2008, 3, 5), 1) def 
test_open_days_friday_monday(self): self.assertOpenDays(self.datecls(2008, 3, 7), self.datecls(2008, 3, 10), 1) def test_open_days_friday_monday_with_two_weekends(self): self.assertOpenDays(self.datecls(2008, 3, 7), self.datecls(2008, 3, 17), 6) def test_open_days_tuesday_wednesday(self): """week-end + easter monday""" self.assertOpenDays(self.datecls(2008, 3, 18), self.datecls(2008, 3, 26), 5) def test_open_days_friday_saturday(self): self.assertOpenDays(self.datecls(2008, 3, 7), self.datecls(2008, 3, 8), 1) def test_open_days_friday_sunday(self): self.assertOpenDays(self.datecls(2008, 3, 7), self.datecls(2008, 3, 9), 1) def test_open_days_saturday_sunday(self): self.assertOpenDays(self.datecls(2008, 3, 8), self.datecls(2008, 3, 9), 0) def test_open_days_saturday_monday(self): self.assertOpenDays(self.datecls(2008, 3, 8), self.datecls(2008, 3, 10), 0) def test_open_days_saturday_tuesday(self): self.assertOpenDays(self.datecls(2008, 3, 8), self.datecls(2008, 3, 11), 1) def test_open_days_now_now(self): x = self.now() self.assertOpenDays(x, x, 0) def test_open_days_now_now2(self): x = self.datetimecls(2010, 5, 24) self.assertOpenDays(x, x, 0) def test_open_days_afternoon_before_holiday(self): self.assertOpenDays(self.datetimecls(2008, 5, 7, 14), self.datetimecls(2008, 5, 8, 0), 1) def test_open_days_afternoon_before_saturday(self): self.assertOpenDays(self.datetimecls(2008, 5, 9, 14), self.datetimecls(2008, 5, 10, 14), 1) def test_open_days_afternoon(self): self.assertOpenDays(self.datetimecls(2008, 5, 6, 14), self.datetimecls(2008, 5, 7, 14), 1) @tag("posix", "1900") def test_ustrftime_before_1900(self): date = self.datetimecls(1328, 3, 12, 6, 30) self.assertEqual(ustrftime(date, "%Y-%m-%d %H:%M:%S"), "1328-03-12 06:30:00") @tag("posix", "1900") def test_ticks2datetime_before_1900(self): ticks = -2209075200000 date = ticks2datetime(ticks) self.assertEqual(ustrftime(date, "%Y-%m-%d"), "1899-12-31") def test_month(self): """enumerate months""" r = list(date_range(self.datecls(2006, 5, 6), self.datecls(2006, 8, 27), incmonth=True)) expected = [ self.datecls(2006, 5, 6), self.datecls(2006, 6, 1), self.datecls(2006, 7, 1), self.datecls(2006, 8, 1), ] self.assertListEqual(expected, r) def test_utcdatetime(self): if self.datetimecls is mxDateTime: return d = self.datetimecls(2014, 11, 26, 12, 0, 0, 57, tzinfo=pytz.utc) d = utcdatetime(d) self.assertEqual(d, self.datetimecls(2014, 11, 26, 12, 0, 0, 57)) self.assertIsNone(d.tzinfo) d = pytz.timezone("Europe/Paris").localize(self.datetimecls(2014, 11, 26, 12, 0, 0, 57)) d = utcdatetime(d) self.assertEqual(d, self.datetimecls(2014, 11, 26, 11, 0, 0, 57)) self.assertIsNone(d.tzinfo) d = pytz.timezone("Europe/Paris").localize(self.datetimecls(2014, 7, 26, 12, 0, 0, 57)) d = utcdatetime(d) self.assertEqual(d, self.datetimecls(2014, 7, 26, 10, 0, 0, 57)) self.assertIsNone(d.tzinfo) def test_datetime2ticks(self): d = datetime(2014, 11, 26, 12, 0, 0, 57, tzinfo=pytz.utc) timestamp = timegm(d.timetuple()) self.assertEqual(datetime2ticks(d), timestamp * 1000) d = d.replace(microsecond=123456) self.assertEqual(datetime2ticks(d), timestamp * 1000 + 123) def test_datetime2ticks_date_argument(self): d = date(2014, 11, 26) timestamp = timegm(d.timetuple()) self.assertEqual(datetime2ticks(d), timestamp * 1000) class MxDateTC(DateTC): datecls = mxDate datetimecls = mxDateTime timedeltacls = RelativeDateTime now = mxNow def check_mx(self): if mxDate is None: self.skipTest("mx.DateTime is not installed") def setUp(self): self.check_mx() def test_month(self): 
"""enumerate months""" r = list(date_range(self.datecls(2000, 1, 2), self.datecls(2000, 4, 4), endOfMonth)) expected = [self.datecls(2000, 1, 2), self.datecls(2000, 2, 29), self.datecls(2000, 3, 31)] self.assertListEqual(r, expected) r = list(date_range(self.datecls(2000, 11, 30), self.datecls(2001, 2, 3), endOfMonth)) expected = [ self.datecls(2000, 11, 30), self.datecls(2000, 12, 31), self.datecls(2001, 1, 31), ] self.assertListEqual(r, expected) if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_decorators.py0000666000000000000000000001560614762603732020014 0ustar00rootroot# copyright 2003-2013 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """unit tests for the decorators module""" import types from logilab.common.testlib import TestCase, unittest_main from logilab.common.decorators import monkeypatch, cached, clear_cache, copy_cache, cachedproperty class DecoratorsTC(TestCase): def test_monkeypatch_instance_method(self): class MyClass: pass @monkeypatch(MyClass) def meth1(self): return 12 class XXX: @monkeypatch(MyClass) def meth2(self): return 12 # with python3, unbound method are functions self.assertIsInstance(MyClass.meth1, types.FunctionType) self.assertIsInstance(MyClass.meth2, types.FunctionType) self.assertEqual(MyClass().meth1(), 12) self.assertEqual(MyClass().meth2(), 12) def test_monkeypatch_property(self): class MyClass: pass @monkeypatch(MyClass, methodname="prop1") @property def meth1(self): return 12 self.assertIsInstance(MyClass.prop1, property) self.assertEqual(MyClass().prop1, 12) def test_monkeypatch_arbitrary_callable(self): class MyClass: pass class ArbitraryCallable: def __call__(self): return 12 # ensure it complains about missing __name__ with self.assertRaises(AttributeError) as cm: monkeypatch(MyClass)(ArbitraryCallable()) self.assertTrue( str(cm.exception).endswith( "has no __name__ attribute: you should provide an explicit `methodname`" ) ) # ensure no black magic under the hood monkeypatch(MyClass, "foo")(ArbitraryCallable()) self.assertTrue(callable(MyClass.foo)) self.assertEqual(MyClass().foo(), 12) def test_monkeypatch_with_same_name(self): class MyClass: pass @monkeypatch(MyClass) def meth1(self): return 12 self.assertEqual([attr for attr in dir(MyClass) if attr[:2] != "__"], ["meth1"]) inst = MyClass() self.assertEqual(inst.meth1(), 12) def test_monkeypatch_with_custom_name(self): class MyClass: pass @monkeypatch(MyClass, "foo") def meth2(self, param): return param + 12 self.assertEqual([attr for attr in dir(MyClass) if attr[:2] != "__"], ["foo"]) inst = MyClass() self.assertEqual(inst.foo(4), 16) def test_cannot_cache_generator(self): def foo(): yield 42 self.assertRaises(AssertionError, cached, foo) def 
test_cached_preserves_docstrings_and_name(self): class Foo: @cached def foo(self): """what's up doc ?""" def bar(self, zogzog): """what's up doc ?""" bar = cached(bar, 1) @cached def quux(self, zogzog): """what's up doc ?""" self.assertEqual(Foo.foo.__doc__, """what's up doc ?""") self.assertEqual(Foo.foo.__name__, "foo") self.assertEqual(Foo.bar.__doc__, """what's up doc ?""") self.assertEqual(Foo.bar.__name__, "bar") self.assertEqual(Foo.quux.__doc__, """what's up doc ?""") self.assertEqual(Foo.quux.__name__, "quux") def test_cached_single_cache(self): class Foo: @cached(cacheattr="_foo") def foo(self): """what's up doc ?""" foo = Foo() foo.foo() self.assertTrue(hasattr(foo, "_foo")) clear_cache(foo, "foo") self.assertFalse(hasattr(foo, "_foo")) def test_cached_multi_cache(self): class Foo: @cached(cacheattr="_foo") def foo(self, args): """what's up doc ?""" foo = Foo() foo.foo(1) self.assertEqual(foo._foo, {(1,): None}) clear_cache(foo, "foo") self.assertFalse(hasattr(foo, "_foo")) def test_cached_keyarg_cache(self): class Foo: @cached(cacheattr="_foo", keyarg=1) def foo(self, other, args): """what's up doc ?""" foo = Foo() foo.foo(2, 1) self.assertEqual(foo._foo, {2: None}) clear_cache(foo, "foo") self.assertFalse(hasattr(foo, "_foo")) def test_cached_property(self): class Foo: @property @cached(cacheattr="_foo") def foo(self): """what's up doc ?""" foo = Foo() foo.foo self.assertEqual(foo._foo, None) clear_cache(foo, "foo") self.assertFalse(hasattr(foo, "_foo")) def test_copy_cache(self): class Foo: @cached(cacheattr="_foo") def foo(self, args): """what's up doc ?""" foo = Foo() foo.foo(1) self.assertEqual(foo._foo, {(1,): None}) foo2 = Foo() self.assertFalse(hasattr(foo2, "_foo")) copy_cache(foo2, "foo", foo) self.assertEqual(foo2._foo, {(1,): None}) def test_cachedproperty(self): class Foo: x = 0 @cachedproperty def bar(self): self.__class__.x += 1 return self.__class__.x @cachedproperty def quux(self): """some prop""" return 42 foo = Foo() self.assertEqual(Foo.x, 0) self.assertNotIn("bar", foo.__dict__) self.assertEqual(foo.bar, 1) self.assertIn("bar", foo.__dict__) self.assertEqual(foo.bar, 1) self.assertEqual(foo.quux, 42) self.assertEqual(Foo.bar.__doc__, "") self.assertEqual(Foo.quux.__doc__, "\nsome prop") foo2 = Foo() self.assertEqual(foo2.bar, 2) # make sure foo.foo is cached self.assertEqual(foo.bar, 1) class Kallable: def __call__(self): return 42 self.assertRaises(TypeError, cachedproperty, Kallable()) if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_deprecation.py0000666000000000000000000011641514762603732020144 0ustar00rootroot# copyright 2003-2012 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . 
"""unit tests for logilab.common.deprecation""" import os import warnings from inspect import currentframe, getframeinfo from logilab.common.testlib import TestCase, unittest_main from logilab.common.modutils import LazyObject from logilab.common import deprecation CURRENT_FILE = os.path.abspath(__file__) class RawInputTC(TestCase): # XXX with 2.6 we could test warnings # http://docs.python.org/library/warnings.html#testing-warnings # instead we just make sure it does not crash def mock_warn(self, *args, **kwargs): self.messages.append(str(args[0])) def setUp(self): self.messages = [] deprecation.warn = self.mock_warn def tearDown(self): deprecation.warn = warnings.warn def mk_func(self): def any_func(): pass return any_func def test_class_deprecated(self): class AnyClass(object, metaclass=deprecation.class_deprecated): pass AnyClass() class AnyClass(object, metaclass=deprecation.class_deprecated): __deprecation_warning_version__ = "1.2.3" AnyClass() self.assertEqual( self.messages, [ "[test_deprecation] AnyClass is deprecated", "[test_deprecation 1.2.3] AnyClass is deprecated", ], ) def test_class_renamed(self): class AnyClass: pass OldClass = deprecation.class_renamed("OldClass", AnyClass) OldClass() OldClass = deprecation.class_renamed("OldClass", AnyClass, version="1.2.3") OldClass() self.assertEqual( self.messages, [ "[test_deprecation] OldClass is deprecated, use AnyClass instead", "[test_deprecation 1.2.3] OldClass is deprecated, use AnyClass instead", ], ) def test_class_renamed_conflict_metaclass(self): class SomeMetaClass(type): pass class AnyClass(metaclass=SomeMetaClass): pass # make sure the "metaclass conflict: the metaclass of a derived class must be a # (non-strict) subclass of the metaclasses of all its bases" exception won't be raised deprecation.class_renamed("OldClass", AnyClass) deprecation.class_renamed("OldClass", AnyClass, version="1.2.3") def test_class_moved(self): class AnyClass: pass OldClass = deprecation.class_moved(new_class=AnyClass, old_name="OldName") OldClass() OldClass = deprecation.class_moved(new_class=AnyClass, old_name="OldName", version="1.2.3") OldClass() self.assertEqual( self.messages, [ "[test_deprecation] class test_deprecation.OldName is now available as " "test_deprecation.AnyClass", "[test_deprecation 1.2.3] class test_deprecation.OldName is now available as " "test_deprecation.AnyClass", ], ) self.messages = [] AnyClass2 = deprecation.class_moved(new_class=AnyClass) AnyClass2() AnyClass3 = deprecation.class_moved(new_class=AnyClass, version="1.2.3") AnyClass3() self.assertEqual( self.messages, [ "[test_deprecation] class test_deprecation.AnyClass is now available as " "test_deprecation.AnyClass", "[test_deprecation 1.2.3] class test_deprecation.AnyClass is now available as " "test_deprecation.AnyClass", ], ) def test_deprecated_func(self): any_func = deprecation.callable_deprecated()(self.mk_func()) any_func() any_func = deprecation.callable_deprecated("message")(self.mk_func()) any_func() any_func = deprecation.callable_deprecated(version="1.2.3")(self.mk_func()) any_func() any_func = deprecation.callable_deprecated("message", version="1.2.3")(self.mk_func()) any_func() self.assertEqual( self.messages, [ '[test_deprecation] The function "any_func" is deprecated', "[test_deprecation] message", '[test_deprecation 1.2.3] The function "any_func" is deprecated', "[test_deprecation 1.2.3] message", ], ) def test_deprecated_decorator(self): @deprecation.callable_deprecated() def any_func(): pass any_func() 
@deprecation.callable_deprecated("message") def any_func(): pass any_func() @deprecation.callable_deprecated(version="1.2.3") def any_func(): pass any_func() @deprecation.callable_deprecated("message", version="1.2.3") def any_func(): pass any_func() self.assertEqual( self.messages, [ '[test_deprecation] The function "any_func" is deprecated', "[test_deprecation] message", '[test_deprecation 1.2.3] The function "any_func" is deprecated', "[test_deprecation 1.2.3] message", ], ) def test_deprecated_decorator_bad_lazyobject(self): # this should not raised an ImportationError deprecation.callable_deprecated("foobar")(LazyObject("cubes.localperms", "xperm")) # with or without giving it a message (because it shouldn't access # attributes of the wrapped object before the object is called) deprecation.callable_deprecated()(LazyObject("cubes.localperms", "xperm")) # all of this is done because of the magical way LazyObject is working # and that sometime CW used to use it to do fake import on deprecated # modules to raise a warning if they were used but not importing them # by default. # See: https://forge.extranet.logilab.fr/cubicweb/cubicweb/blob/3.24.0/cubicweb/schemas/__init__.py#L51 # noqa def test_lazy_wraps_function_name(self): """ Avoid conflict from lazy_wraps where __name__ isn't correctly set on the wrapper from the wrapped and we end up with the name of the wrapper instead of the wrapped. Like here it would fail if "check_kwargs" is the name of the new function instead of new_function_name, this is because the wrapper in argument_renamed is called check_kwargs and doesn't transmit the __name__ of the wrapped (new_function_name) correctly. """ @deprecation.argument_renamed(old_name="a", new_name="b") def new_function_name(b): pass old_function_name = deprecation.callable_renamed( old_name="old_function_name", new_function=new_function_name ) old_function_name(None) assert "old_function_name" in self.messages[0] assert "new_function_name" in self.messages[0] assert "check_kwargs" not in self.messages[0] def test_attribute_renamed(self): @deprecation.attribute_renamed(old_name="old", new_name="new") class SomeClass: def __init__(self): self.new = 42 some_class = SomeClass() self.assertEqual(some_class.old, some_class.new) self.assertEqual( self.messages, [ "[test_deprecation] SomeClass.old has been renamed and is deprecated, " "use SomeClass.new instead" ], ) some_class.old = 43 self.assertEqual(some_class.old, 43) self.assertEqual(some_class.old, some_class.new) self.assertTrue(hasattr(some_class, "new")) self.assertTrue(hasattr(some_class, "old")) del some_class.old self.assertFalse(hasattr(some_class, "new")) self.assertFalse(hasattr(some_class, "old")) def test_attribute_renamed_version(self): @deprecation.attribute_renamed(old_name="old", new_name="new", version="1.2.3") class SomeClass: def __init__(self): self.new = 42 some_class = SomeClass() self.assertEqual(some_class.old, some_class.new) self.assertEqual( self.messages, [ "[test_deprecation 1.2.3] SomeClass.old has been renamed and is deprecated, " "use SomeClass.new instead" ], ) some_class.old = 43 self.assertEqual(some_class.old, 43) self.assertEqual(some_class.old, some_class.new) self.assertTrue(hasattr(some_class, "new")) self.assertTrue(hasattr(some_class, "old")) del some_class.old self.assertFalse(hasattr(some_class, "new")) self.assertFalse(hasattr(some_class, "old")) def test_argument_renamed(self): @deprecation.argument_renamed(old_name="old", new_name="new") def some_function(new): return new 
self.assertEqual(some_function(new=42), 42) self.assertEqual(some_function(old=42), 42) self.assertEqual( self.messages, [ "[test_deprecation] argument old of callable some_function has been renamed and is " "deprecated, use keyword argument new instead" ], ) with self.assertRaises(ValueError): some_function(new=42, old=42) def test_argument_renamed_version(self): @deprecation.argument_renamed(old_name="old", new_name="new", version="1.2.3") def some_function(new): return new self.assertEqual(some_function(new=42), 42) self.assertEqual(some_function(old=42), 42) self.assertEqual( self.messages, [ "[test_deprecation 1.2.3] argument old of callable some_function has been renamed " "and is deprecated, use keyword argument new instead" ], ) with self.assertRaises(ValueError): some_function(new=42, old=42) def test_argument_removed(self): @deprecation.argument_removed("old") def some_function(new): return new self.assertEqual(some_function(new=42), 42) self.assertEqual(some_function(new=10, old=20), 10) self.assertEqual( self.messages, [ "[test_deprecation] argument old of callable some_function has been removed and is " "deprecated" ], ) def test_argument_removed_version(self): @deprecation.argument_removed("old", version="1.2.3") def some_function(new): return new self.assertEqual(some_function(new=42), 42) self.assertEqual(some_function(new=10, old=20), 10) self.assertEqual( self.messages, [ "[test_deprecation 1.2.3] argument old of callable some_function has been removed " "and is deprecated" ], ) def test_callable_renamed(self): def any_func(): pass old_func = deprecation.callable_renamed("old_func", any_func) old_func() self.assertEqual( self.messages, [ "[test_deprecation] old_func has been renamed and is deprecated, " "uses any_func instead" ], ) def test_callable_renamed_version(self): def any_func(): pass old_func = deprecation.callable_renamed("old_func", any_func, version="1.2.3") old_func() self.assertEqual( self.messages, [ "[test_deprecation 1.2.3] old_func has been renamed and is deprecated, " "uses any_func instead" ], ) def test_callable_moved(self): module = "data.deprecation" moving_target = deprecation.callable_moved(module, "moving_target") moving_target() self.assertEqual( self.messages, [ "[test_deprecation] object test_deprecation.moving_target has been moved to " "data.deprecation.moving_target" ], ) def test_callable_moved_version(self): module = "data.deprecation" moving_target = deprecation.callable_moved(module, "moving_target", version="1.2.3") moving_target() self.assertEqual( self.messages, [ "[test_deprecation 1.2.3] object test_deprecation.moving_target has been moved to " "data.deprecation.moving_target" ], ) class StructuredDeprecatedWarningsTest(TestCase): def mock_warn(self, *args, **kwargs): self.collected_warnings.append(args[0]) def setUp(self): self.collected_warnings = [] deprecation.warn = self.mock_warn def tearDown(self): deprecation.warn = warnings.warn def mk_func(self): def any_func(): pass return any_func def test_class_deprecated(self): class AnyClass(metaclass=deprecation.class_deprecated): pass AnyClass() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.DEPRECATED) self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CLASS) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") def test_class_deprecated_version(self): class AnyClass(metaclass=deprecation.class_deprecated): 
__deprecation_warning_version__ = "1.2.3" AnyClass() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.DEPRECATED) self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CLASS) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, "logilab-common") def test_class_renamed(self): class AnyClass: pass OldClass = deprecation.class_renamed("OldClass", AnyClass) OldClass() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.RENAMED) self.assertEqual(warning.old_name, "OldClass") self.assertEqual(warning.new_name, "AnyClass") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CLASS) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") def test_class_renamed_version(self): class AnyClass: pass OldClass = deprecation.class_renamed("OldClass", AnyClass, version="1.2.3") OldClass() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.RENAMED) self.assertEqual(warning.old_name, "OldClass") self.assertEqual(warning.new_name, "AnyClass") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CLASS) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, "logilab-common") def test_class_moved(self): class AnyClass: pass OldClass = deprecation.class_moved(new_class=AnyClass, old_name="OldName") OldClass() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.MOVED) self.assertEqual(warning.old_module, "test_deprecation") self.assertEqual(warning.new_module, "test_deprecation") self.assertEqual(warning.old_name, "OldName") self.assertEqual(warning.new_name, "AnyClass") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CLASS) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") self.collected_warnings = [] AnyClass = deprecation.class_moved(new_class=AnyClass) AnyClass() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.MOVED) self.assertEqual(warning.old_module, "test_deprecation") self.assertEqual(warning.new_module, "test_deprecation") self.assertEqual(warning.old_name, "AnyClass") self.assertEqual(warning.new_name, "AnyClass") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CLASS) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") def test_class_moved_version(self): class AnyClass: pass OldClass = deprecation.class_moved(new_class=AnyClass, old_name="OldName", version="1.2.3") OldClass() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.MOVED) self.assertEqual(warning.old_module, "test_deprecation") self.assertEqual(warning.new_module, "test_deprecation") self.assertEqual(warning.old_name, "OldName") self.assertEqual(warning.new_name, "AnyClass") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CLASS) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, 
"logilab-common") self.collected_warnings = [] AnyClass = deprecation.class_moved(new_class=AnyClass, version="1.2.3") AnyClass() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.MOVED) self.assertEqual(warning.old_module, "test_deprecation") self.assertEqual(warning.new_module, "test_deprecation") self.assertEqual(warning.old_name, "AnyClass") self.assertEqual(warning.new_name, "AnyClass") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CLASS) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, "logilab-common") def test_deprecated_func(self): any_func = deprecation.callable_deprecated()(self.mk_func()) any_func() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.DEPRECATED) self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CALLABLE) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") any_func = deprecation.callable_deprecated("message")(self.mk_func()) any_func() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.DEPRECATED) self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CALLABLE) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") def test_deprecated_func_version(self): any_func = deprecation.callable_deprecated(version="1.2.3")(self.mk_func()) any_func() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.DEPRECATED) self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CALLABLE) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, "logilab-common") any_func = deprecation.callable_deprecated("message", version="1.2.3")(self.mk_func()) any_func() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.DEPRECATED) self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CALLABLE) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, "logilab-common") def test_deprecated_decorator(self): @deprecation.callable_deprecated() def any_func(): pass any_func() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.DEPRECATED) self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CALLABLE) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") @deprecation.callable_deprecated("message") def any_func(): pass any_func() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.DEPRECATED) self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CALLABLE) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") def test_deprecated_decorator_version(self): @deprecation.callable_deprecated(version="1.2.3") def any_func(): pass any_func() self.assertEqual(len(self.collected_warnings), 1) warning = 
self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.DEPRECATED) self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CALLABLE) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, "logilab-common") @deprecation.callable_deprecated("message", version="1.2.3") def any_func(): pass any_func() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.DEPRECATED) self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CALLABLE) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, "logilab-common") def test_attribute_renamed(self): @deprecation.attribute_renamed(old_name="old", new_name="new") class SomeClass: def __init__(self): self.new = 42 some_class = SomeClass() some_class.old == some_class.new # trigger warning self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.RENAMED) self.assertEqual(warning.old_name, "old") self.assertEqual(warning.new_name, "new") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.ATTRIBUTE) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") some_class.old = 43 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.RENAMED) self.assertEqual(warning.old_name, "old") self.assertEqual(warning.new_name, "new") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.ATTRIBUTE) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") del some_class.old self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.RENAMED) self.assertEqual(warning.old_name, "old") self.assertEqual(warning.new_name, "new") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.ATTRIBUTE) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") def test_attribute_renamed_version(self): @deprecation.attribute_renamed(old_name="old", new_name="new", version="1.2.3") class SomeClass: def __init__(self): self.new = 42 some_class = SomeClass() some_class.old == some_class.new # trigger warning self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.RENAMED) self.assertEqual(warning.old_name, "old") self.assertEqual(warning.new_name, "new") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.ATTRIBUTE) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, "logilab-common") some_class.old = 43 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.RENAMED) self.assertEqual(warning.old_name, "old") self.assertEqual(warning.new_name, "new") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.ATTRIBUTE) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, "logilab-common") del some_class.old self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, 
deprecation.DeprecationWarningOperation.RENAMED) self.assertEqual(warning.old_name, "old") self.assertEqual(warning.new_name, "new") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.ATTRIBUTE) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, "logilab-common") def test_argument_renamed(self): @deprecation.argument_renamed(old_name="old", new_name="new") def some_function(new): return new some_function(old=42) self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.RENAMED) self.assertEqual(warning.old_name, "old") self.assertEqual(warning.new_name, "new") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.ARGUMENT) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") def test_argument_renamed_version(self): @deprecation.argument_renamed(old_name="old", new_name="new", version="1.2.3") def some_function(new): return new some_function(old=42) self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.RENAMED) self.assertEqual(warning.old_name, "old") self.assertEqual(warning.new_name, "new") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.ARGUMENT) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, "logilab-common") def test_argument_removed(self): @deprecation.argument_removed("old") def some_function(new): return new some_function(new=10, old=20) self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.REMOVED) self.assertEqual(warning.name, "old") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.ARGUMENT) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") def test_argument_removed_version(self): @deprecation.argument_removed("old", version="1.2.3") def some_function(new): return new some_function(new=10, old=20) self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.REMOVED) self.assertEqual(warning.name, "old") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.ARGUMENT) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, "logilab-common") def test_callable_renamed(self): def any_func(): pass old_func = deprecation.callable_renamed("old_func", any_func) old_func() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.RENAMED) self.assertEqual(warning.old_name, "old_func") self.assertEqual(warning.new_name, "any_func") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CALLABLE) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") def test_callable_renamed_version(self): def any_func(): pass old_func = deprecation.callable_renamed("old_func", any_func, version="1.2.3") old_func() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.RENAMED) self.assertEqual(warning.old_name, "old_func") self.assertEqual(warning.new_name, "any_func") 
self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CALLABLE) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, "logilab-common") def test_callable_moved(self): module = "data.deprecation" moving_target = deprecation.callable_moved(module, "moving_target") moving_target() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.MOVED) self.assertEqual(warning.old_module, "test_deprecation") self.assertEqual(warning.new_module, "data.deprecation") self.assertEqual(warning.old_name, "moving_target") self.assertEqual(warning.new_name, "moving_target") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CALLABLE) self.assertEqual(warning.version, None) self.assertEqual(warning.package, "logilab-common") def test_callable_moved_version(self): module = "data.deprecation" moving_target = deprecation.callable_moved(module, "moving_target", version="1.2.3") moving_target() self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() self.assertEqual(warning.operation, deprecation.DeprecationWarningOperation.MOVED) self.assertEqual(warning.old_module, "test_deprecation") self.assertEqual(warning.new_module, "data.deprecation") self.assertEqual(warning.old_name, "moving_target") self.assertEqual(warning.new_name, "moving_target") self.assertEqual(warning.kind, deprecation.DeprecationWarningKind.CALLABLE) self.assertEqual(warning.version, "1.2.3") self.assertEqual(warning.package, "logilab-common") class DeprecatedWarningsTracebackLocationTest(TestCase): def setUp(self): self.catch_warnings = warnings.catch_warnings(record=True) self.collected_warnings = self.catch_warnings.__enter__() def tearDown(self): self.catch_warnings.__exit__() def mk_func(self): def any_func(): pass return any_func def test_class_deprecated(self): class AnyClass(metaclass=deprecation.class_deprecated): pass AnyClass() warning_line = getframeinfo(currentframe()).lineno - 1 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) def test_class_renamed(self): class AnyClass: pass OldClass = deprecation.class_renamed("OldClass", AnyClass) OldClass() warning_line = getframeinfo(currentframe()).lineno - 1 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) def test_class_moved(self): class AnyClass: pass OldClass = deprecation.class_moved(new_class=AnyClass, old_name="OldName") OldClass() warning_line = getframeinfo(currentframe()).lineno - 1 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) AnyClass = deprecation.class_moved(new_class=AnyClass) AnyClass() warning_line = getframeinfo(currentframe()).lineno - 1 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) def test_deprecated_func(self): any_func = deprecation.callable_deprecated()(self.mk_func()) any_func() warning_line = getframeinfo(currentframe()).lineno - 1 
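        # Descriptive note (added): warning_line captures the line of the any_func()
        # call just above; the checks that follow assert that the DeprecationWarning
        # is reported at the caller's filename:lineno rather than inside the
        # deprecation wrapper itself.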
self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) any_func = deprecation.callable_deprecated("message")(self.mk_func()) any_func() warning_line = getframeinfo(currentframe()).lineno - 1 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) def test_deprecated_decorator(self): @deprecation.callable_deprecated() def any_func(): pass any_func() warning_line = getframeinfo(currentframe()).lineno - 1 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) @deprecation.callable_deprecated("message") def any_func(): pass any_func() warning_line = getframeinfo(currentframe()).lineno - 1 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) def test_attribute_renamed(self): @deprecation.attribute_renamed(old_name="old", new_name="new") class SomeClass: def __init__(self): self.new = 42 some_class = SomeClass() some_class.old == some_class.new # trigger warning warning_line = getframeinfo(currentframe()).lineno - 1 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) some_class.old = 43 warning_line = getframeinfo(currentframe()).lineno - 1 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) del some_class.old warning_line = getframeinfo(currentframe()).lineno - 1 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) def test_argument_renamed(self): @deprecation.argument_renamed(old_name="old", new_name="new") def some_function(new): return new some_function(old=42) warning_line = getframeinfo(currentframe()).lineno - 1 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) def test_argument_removed(self): @deprecation.argument_removed("old") def some_function(new): return new some_function(new=10, old=20) warning_line = getframeinfo(currentframe()).lineno - 1 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) def test_callable_renamed(self): def any_func(): pass old_func = deprecation.callable_renamed("old_func", any_func) old_func() warning_line = getframeinfo(currentframe()).lineno - 1 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) def test_callable_moved(self): module = "data.deprecation" moving_target = deprecation.callable_moved(module, 
"moving_target") moving_target() warning_line = getframeinfo(currentframe()).lineno - 1 self.assertEqual(len(self.collected_warnings), 1) warning = self.collected_warnings.pop() location = f"{CURRENT_FILE}:{warning_line}" self.assertEqual(f"{warning.filename}:{warning.lineno}", location) if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_fileutils.py0000666000000000000000000001356214762603732017646 0ustar00rootroot# copyright 2003-2012 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """unit tests for logilab.common.fileutils""" import doctest import sys import os import tempfile import shutil from stat import S_IWRITE from os.path import join from logilab.common.testlib import TestCase, unittest_main from logilab.common.fileutils import ( first_level_directory, is_binary, write_open_mode, lines, export, exists, ProtectedFile, ) DATA_DIR = join(os.path.abspath(os.path.dirname(__file__)), "data") NEWLINES_TXT = join(DATA_DIR, "newlines.txt") class FirstleveldirectoryTC(TestCase): def test_known_values_first_level_directory(self): """return the first level directory of a path""" self.assertEqual(first_level_directory("truc/bidule/chouette"), "truc", None) self.assertEqual(first_level_directory("/truc/bidule/chouette"), "/", None) class IsBinaryTC(TestCase): def test(self): self.assertEqual(is_binary("toto.txt"), 0) # self.assertEqual(is_binary('toto.xml'), 0) self.assertEqual(is_binary("toto.bin"), 1) self.assertEqual(is_binary("toto.sxi"), 1) self.assertEqual(is_binary("toto.whatever"), 1) class GetModeTC(TestCase): def test(self): self.assertEqual(write_open_mode("toto.txt"), "w") # self.assertEqual(write_open_mode('toto.xml'), 'w') self.assertEqual(write_open_mode("toto.bin"), "wb") self.assertEqual(write_open_mode("toto.sxi"), "wb") class NormReadTC(TestCase): def test_known_values_norm_read(self): with open(NEWLINES_TXT) as f: data = f.read() self.assertEqual(data.strip(), "\n".join(["# mixed new lines", "1", "2", "3"])) class LinesTC(TestCase): def test_known_values_lines(self): self.assertEqual(lines(NEWLINES_TXT), ["# mixed new lines", "1", "2", "3"]) def test_known_values_lines_comment(self): self.assertEqual(lines(NEWLINES_TXT, comments="#"), ["1", "2", "3"]) class ExportTC(TestCase): def setUp(self): self.tempdir = tempfile.mktemp() os.mkdir(self.tempdir) def test(self): export(DATA_DIR, self.tempdir, verbose=0) self.assertTrue(exists(join(self.tempdir, "__init__.py"))) self.assertTrue(exists(join(self.tempdir, "sub"))) self.assertTrue(not exists(join(self.tempdir, "__init__.pyc"))) self.assertTrue(not exists(join(self.tempdir, "CVS"))) def tearDown(self): shutil.rmtree(self.tempdir) class ProtectedFileTC(TestCase): def setUp(self): 
self.rpath = join(DATA_DIR, "write_protected_file.txt") self.rwpath = join(DATA_DIR, "normal_file.txt") # Make sure rpath is not writable ! os.chmod(self.rpath, 33060) # Make sure rwpath is writable ! os.chmod(self.rwpath, 33188) def test_mode_change(self): """tests that mode is changed when needed""" # test on non-writable file # self.assertTrue(not os.access(self.rpath, os.W_OK)) self.assertTrue(not os.stat(self.rpath).st_mode & S_IWRITE) # for some reason if you remove "wp_file =" the test fails wp_file = ProtectedFile(self.rpath, "w") # noqa self.assertTrue(os.stat(self.rpath).st_mode & S_IWRITE) self.assertTrue(os.access(self.rpath, os.W_OK)) # test on writable-file self.assertTrue(os.stat(self.rwpath).st_mode & S_IWRITE) self.assertTrue(os.access(self.rwpath, os.W_OK)) ProtectedFile(self.rwpath, "w") self.assertTrue(os.stat(self.rwpath).st_mode & S_IWRITE) self.assertTrue(os.access(self.rwpath, os.W_OK)) def test_restore_on_close(self): """tests original mode is restored on close""" # test on non-writable file # self.assertTrue(not os.access(self.rpath, os.W_OK)) self.assertTrue(not os.stat(self.rpath).st_mode & S_IWRITE) ProtectedFile(self.rpath, "w").close() # self.assertTrue(not os.access(self.rpath, os.W_OK)) self.assertTrue(not os.stat(self.rpath).st_mode & S_IWRITE) # test on writable-file self.assertTrue(os.access(self.rwpath, os.W_OK)) self.assertTrue(os.stat(self.rwpath).st_mode & S_IWRITE) ProtectedFile(self.rwpath, "w").close() self.assertTrue(os.access(self.rwpath, os.W_OK)) self.assertTrue(os.stat(self.rwpath).st_mode & S_IWRITE) def test_mode_change_on_append(self): """tests that mode is changed when file is opened in 'a' mode""" # self.assertTrue(not os.access(self.rpath, os.W_OK)) self.assertTrue(not os.stat(self.rpath).st_mode & S_IWRITE) wp_file = ProtectedFile(self.rpath, "a") self.assertTrue(os.access(self.rpath, os.W_OK)) self.assertTrue(os.stat(self.rpath).st_mode & S_IWRITE) wp_file.close() # self.assertTrue(not os.access(self.rpath, os.W_OK)) self.assertTrue(not os.stat(self.rpath).st_mode & S_IWRITE) if sys.version_info < (3, 0): def load_tests(loader, tests, ignore): from logilab.common import fileutils tests.addTests(doctest.DocTestSuite(fileutils)) return tests if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_graph.py0000666000000000000000000000622714762603732016747 0ustar00rootroot# unit tests for the cache module # copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . 
from logilab.common.testlib import TestCase, unittest_main
from logilab.common.graph import get_cycles, has_path, ordered_nodes, UnorderableGraph


class getCyclesTC(TestCase):
    def test_known0(self):
        self.assertEqual(get_cycles({1: [2], 2: [3], 3: [1]}), [[1, 2, 3]])

    def test_known1(self):
        self.assertEqual(get_cycles({1: [2], 2: [3], 3: [1, 4], 4: [3]}), [[1, 2, 3], [3, 4]])

    def test_known2(self):
        self.assertEqual(get_cycles({1: [2], 2: [3], 3: [0], 0: []}), [])


class hasPathTC(TestCase):
    def test_direct_connection(self):
        self.assertEqual(has_path({"A": ["B"], "B": ["A"]}, "A", "B"), ["B"])

    def test_indirect_connection(self):
        self.assertEqual(has_path({"A": ["B"], "B": ["A", "C"], "C": ["B"]}, "A", "C"), ["B", "C"])

    def test_no_connection(self):
        self.assertEqual(has_path({"A": ["B"], "B": ["A"]}, "A", "C"), None)

    def test_cycle(self):
        self.assertEqual(has_path({"A": ["A"]}, "A", "B"), None)


class ordered_nodesTC(TestCase):
    def test_one_item(self):
        graph = {"a": []}
        ordered = ordered_nodes(graph)
        self.assertEqual(ordered, ("a",))

    def test_single_dependency(self):
        graph = {"a": ["b"], "b": []}
        ordered = ordered_nodes(graph)
        self.assertEqual(ordered, ("a", "b"))
        graph = {"a": [], "b": ["a"]}
        ordered = ordered_nodes(graph)
        self.assertEqual(ordered, ("b", "a"))

    def test_two_items_no_dependency(self):
        graph = {"a": [], "b": []}
        ordered = ordered_nodes(graph)
        self.assertEqual(ordered, ("a", "b"))

    def test_three_items_no_dependency(self):
        graph = {"a": [], "b": [], "c": []}
        ordered = ordered_nodes(graph)
        self.assertEqual(ordered, ("a", "b", "c"))

    def test_three_items_one_dependency(self):
        graph = {"a": ["c"], "b": [], "c": []}
        ordered = ordered_nodes(graph)
        self.assertEqual(ordered, ("a", "b", "c"))

    def test_three_items_two_dependencies(self):
        graph = {"a": ["b"], "b": ["c"], "c": []}
        ordered = ordered_nodes(graph)
        self.assertEqual(ordered, ("a", "b", "c"))

    def test_bad_graph(self):
        graph = {"a": ["b"]}
        self.assertRaises(UnorderableGraph, ordered_nodes, graph)


if __name__ == "__main__":
    unittest_main()

logilab_common-2.1.0/test/test_interface.py

# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved.
# contact http://www.logilab.fr/ -- mailto:contact@logilab.fr
#
# This file is part of logilab-common.
#
# logilab-common is free software: you can redistribute it and/or modify it under
# the terms of the GNU Lesser General Public License as published by the Free
# Software Foundation, either version 2.1 of the License, or (at your option) any
# later version.
#
# logilab-common is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more
# details.
#
# You should have received a copy of the GNU Lesser General Public License along
# with logilab-common. If not, see <http://www.gnu.org/licenses/>.
from logilab.common.testlib import TestCase, unittest_main
from logilab.common.interface import Interface, extend


class IFace1(Interface):
    pass


class IFace2(Interface):
    pass


class IFace3(Interface):
    pass


class A:
    __implements__ = (IFace1,)


class B(A):
    pass


class C1(B):
    __implements__ = list(B.__implements__) + [IFace3]


class C2(B):
    __implements__ = B.__implements__ + (IFace2,)


class D(C1):
    __implements__ = ()


class Z:
    pass


class ExtendTC(TestCase):
    def setUp(self):
        global aimpl, c1impl, c2impl, dimpl
        aimpl = A.__implements__
        c1impl = C1.__implements__
        c2impl = C2.__implements__
        dimpl = D.__implements__

    def test_base(self):
        extend(A, IFace2)
        self.assertEqual(A.__implements__, (IFace1, IFace2))
        self.assertEqual(B.__implements__, (IFace1, IFace2))
        self.assertIs(B.__implements__, A.__implements__)
        self.assertEqual(C1.__implements__, [IFace1, IFace3, IFace2])
        self.assertEqual(C2.__implements__, (IFace1, IFace2))
        self.assertIs(C2.__implements__, c2impl)
        self.assertEqual(D.__implements__, (IFace2,))

    def test_already_impl(self):
        extend(A, IFace1)
        self.assertIs(A.__implements__, aimpl)

    def test_no_impl(self):
        extend(Z, IFace1)
        self.assertEqual(Z.__implements__, (IFace1,))

    def test_notimpl_explicit(self):
        extend(C1, IFace3)
        self.assertIs(C1.__implements__, c1impl)
        self.assertIs(D.__implements__, dimpl)

    def test_nonregr_implements_baseinterface(self):
        class SubIFace(IFace1):
            pass

        class X:
            __implements__ = (SubIFace,)

        self.assertTrue(SubIFace.is_implemented_by(X))
        self.assertTrue(IFace1.is_implemented_by(X))


if __name__ == "__main__":
    unittest_main()

logilab_common-2.1.0/test/test_modutils.py

# copyright 2003-2013 LOGILAB S.A. (Paris, FRANCE), all rights reserved.
# contact http://www.logilab.fr/ -- mailto:contact@logilab.fr
#
# This file is part of logilab-common.
#
# logilab-common is free software: you can redistribute it and/or modify it under
# the terms of the GNU Lesser General Public License as published by the Free
# Software Foundation, either version 2.1 of the License, or (at your option) any
# later version.
#
# logilab-common is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more
# details.
#
# You should have received a copy of the GNU Lesser General Public License along
# with logilab-common. If not, see <http://www.gnu.org/licenses/>.
""" unit tests for module modutils (module manipulation utilities) """ import doctest import sys import warnings try: __file__ except NameError: __file__ = sys.argv[0] from logilab.common.testlib import TestCase, unittest_main from logilab.common import modutils from os import path from logilab import common warnings.simplefilter("default", DeprecationWarning) sys.path.insert(0, path.dirname(__file__)) DATADIR = path.join(path.dirname(__file__), "data") class ModutilsTestCase(TestCase): def setUp(self): super().setUp() self.__common_in_path = common.__path__[0] in sys.path if self.__common_in_path: sys.path.remove(common.__path__[0]) def tearDown(self): if self.__common_in_path: sys.path.insert(0, common.__path__[0]) super().tearDown() class modpath_from_file_tc(ModutilsTestCase): """given an absolute file path return the python module's path as a list""" def test_knownValues_modpath_from_file_1(self): self.assertEqual( modutils.modpath_from_file(modutils.__file__), ["logilab", "common", "modutils"] ) def test_knownValues_modpath_from_file_2(self): self.assertEqual( modutils.modpath_from_file(__file__, {path.split(__file__)[0]: "arbitrary.pkg"}), ["arbitrary", "pkg", "test_modutils"], ) def test_raise_modpath_from_file_Exception(self): self.assertRaises(Exception, modutils.modpath_from_file, "/turlututu") def load_tests(loader, tests, ignore): from logilab.common import modutils tests.addTests(doctest.DocTestSuite(modutils)) return tests if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_registry.py0000666000000000000000000001536714762603732017523 0ustar00rootroot# copyright 2003-2012 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of Logilab-Common. # # Logilab-Common is free software: you can redistribute it and/or modify it under the # terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) # any later version. # # Logilab-Common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with Logilab-Common. If not, see . 
"""unit tests for selectors mechanism""" import gc import logging import sys from contextlib import contextmanager from logilab.common.testlib import ( TestCase, unittest_main, ) from logilab.common.registry import ( Predicate, AndPredicate, OrPredicate, wrap_predicates, RegistryStore, RegistrableInstance, ) logging.basicConfig(level=logging.ERROR) class _1_(Predicate): def __call__(self, *args, **kwargs): return 1 class _0_(Predicate): def __call__(self, *args, **kwargs): return 0 def _2_(*args, **kwargs): return 2 class SelectorsTC(TestCase): def test_basic_and(self): selector = _1_() & _1_() self.assertEqual(selector(None), 2) selector = _1_() & _0_() self.assertEqual(selector(None), 0) selector = _0_() & _1_() self.assertEqual(selector(None), 0) def test_basic_or(self): selector = _1_() | _1_() self.assertEqual(selector(None), 1) selector = _1_() | _0_() self.assertEqual(selector(None), 1) selector = _0_() | _1_() self.assertEqual(selector(None), 1) selector = _0_() | _0_() self.assertEqual(selector(None), 0) def test_selector_and_function(self): selector = _1_() & _2_ self.assertEqual(selector(None), 3) selector = _2_ & _1_() self.assertEqual(selector(None), 3) def test_three_and(self): selector = _1_() & _1_() & _1_() self.assertEqual(selector(None), 3) selector = _1_() & _0_() & _1_() self.assertEqual(selector(None), 0) selector = _0_() & _1_() & _1_() self.assertEqual(selector(None), 0) def test_three_or(self): selector = _1_() | _1_() | _1_() self.assertEqual(selector(None), 1) selector = _1_() | _0_() | _1_() self.assertEqual(selector(None), 1) selector = _0_() | _1_() | _1_() self.assertEqual(selector(None), 1) selector = _0_() | _0_() | _0_() self.assertEqual(selector(None), 0) def test_composition(self): selector = (_1_() & _1_()) & (_1_() & _1_()) self.assertIsInstance(selector, AndPredicate) self.assertEqual(len(selector.selectors), 4) self.assertEqual(selector(None), 4) selector = (_1_() & _0_()) | (_1_() & _1_()) self.assertIsInstance(selector, OrPredicate) self.assertEqual(len(selector.selectors), 2) self.assertEqual(selector(None), 2) def test_search_selectors(self): sel = _1_() self.assertIs(sel.search_selector(_1_), sel) csel = AndPredicate(sel, Predicate()) self.assertIs(csel.search_selector(_1_), sel) csel = AndPredicate(Predicate(), sel) self.assertIs(csel.search_selector(_1_), sel) self.assertIs(csel.search_selector((AndPredicate, OrPredicate)), csel) self.assertIs(csel.search_selector((OrPredicate, AndPredicate)), csel) self.assertIs(csel.search_selector((_1_, _0_)), sel) self.assertIs(csel.search_selector((_0_, _1_)), sel) def test_inplace_and(self): selector = _1_() selector &= _1_() selector &= _1_() self.assertEqual(selector(None), 3) selector = _1_() selector &= _0_() selector &= _1_() self.assertEqual(selector(None), 0) selector = _0_() selector &= _1_() selector &= _1_() self.assertEqual(selector(None), 0) selector = _0_() selector &= _0_() selector &= _0_() self.assertEqual(selector(None), 0) def test_inplace_or(self): selector = _1_() selector |= _1_() selector |= _1_() self.assertEqual(selector(None), 1) selector = _1_() selector |= _0_() selector |= _1_() self.assertEqual(selector(None), 1) selector = _0_() selector |= _1_() selector |= _1_() self.assertEqual(selector(None), 1) selector = _0_() selector |= _0_() selector |= _0_() self.assertEqual(selector(None), 0) def test_wrap_selectors(self): class _temp_(Predicate): def __call__(self, *args, **kwargs): return 0 del _temp_ # test weakref s1 = _1_() & _1_() s2 = _1_() & _0_() s3 = _0_() & _1_() 
gc.collect() self.count = 0 def decorate(f, self=self): def wrapper(*args, **kwargs): self.count += 1 return f(*args, **kwargs) return wrapper wrap_predicates(decorate) self.assertEqual(s1(None), 2) self.assertEqual(s2(None), 0) self.assertEqual(s3(None), 0) self.assertEqual(self.count, 8) @contextmanager def prepended_syspath(path): sys.path.insert(0, path) yield sys.path = sys.path[1:] class RegistryStoreTC(TestCase): def test_autoload_modnames(self): store = RegistryStore() store.setdefault("zereg") with prepended_syspath(self.datadir): store.register_modnames(["regobjects", "regobjects2"]) self.assertEqual(["zereg"], list(store.keys())) self.assertEqual({"appobject1", "appobject2", "appobject3"}, set(store["zereg"])) class RegistrableInstanceTC(TestCase): def test_instance_modulename(self): obj = RegistrableInstance() self.assertEqual(obj.__module__, "logilab.common.registry") # no inheritance obj = RegistrableInstance(__module__=__name__) self.assertEqual(obj.__module__, "test_registry") # with inheritance from another python file with prepended_syspath(self.datadir): from regobjects2 import instance, MyRegistrableInstance instance2 = MyRegistrableInstance(__module__=__name__) self.assertEqual(instance.__module__, "regobjects2") self.assertEqual(instance2.__module__, "test_registry") if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_shellutils.py0000666000000000000000000002314314762603732020032 0ustar00rootroot# copyright 2003-2012 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . 
"""unit tests for logilab.common.shellutils""" from io import StringIO from os.path import join, dirname, abspath from unittest.mock import patch from logilab.common.testlib import TestCase, unittest_main from logilab.common.shellutils import globfind, find, ProgressBar, RawInput DATA_DIR = join(dirname(abspath(__file__)), "data", "find_test") class FindTC(TestCase): def test_include(self): files = set(find(DATA_DIR, ".py")) self.assertSetEqual( files, { join(DATA_DIR, f) for f in [ "__init__.py", "module.py", "module2.py", "noendingnewline.py", "nonregr.py", join("sub", "momo.py"), ] }, ) files = set(find(DATA_DIR, (".py",), blacklist=("sub",))) self.assertSetEqual( files, { join(DATA_DIR, f) for f in [ "__init__.py", "module.py", "module2.py", "noendingnewline.py", "nonregr.py", ] }, ) def test_exclude(self): files = set(find(DATA_DIR, (".py", ".pyc"), exclude=True)) self.assertSetEqual( files, { join(DATA_DIR, f) for f in [ "foo.txt", "newlines.txt", "normal_file.txt", "test.ini", "test1.msg", "test2.msg", "spam.txt", join("sub", "doc.txt"), "write_protected_file.txt", ] }, ) def test_globfind(self): files = set(globfind(DATA_DIR, "*.py")) self.assertSetEqual( files, { join(DATA_DIR, f) for f in [ "__init__.py", "module.py", "module2.py", "noendingnewline.py", "nonregr.py", join("sub", "momo.py"), ] }, ) files = set(globfind(DATA_DIR, "mo*.py")) self.assertSetEqual( files, {join(DATA_DIR, f) for f in ["module.py", "module2.py", join("sub", "momo.py")]}, ) files = set(globfind(DATA_DIR, "mo*.py", blacklist=("sub",))) self.assertSetEqual(files, {join(DATA_DIR, f) for f in ["module.py", "module2.py"]}) class ProgressBarTC(TestCase): def test_refresh(self): pgb_stream = StringIO() expected_stream = StringIO() pgb = ProgressBar(20, stream=pgb_stream) self.assertEqual( pgb_stream.getvalue(), expected_stream.getvalue() ) # nothing print before refresh pgb.refresh() expected_stream.write("\r[" + " " * 20 + "]") self.assertEqual(pgb_stream.getvalue(), expected_stream.getvalue()) def test_refresh_g_size(self): pgb_stream = StringIO() expected_stream = StringIO() pgb = ProgressBar(20, 35, stream=pgb_stream) pgb.refresh() expected_stream.write("\r[" + " " * 35 + "]") self.assertEqual(pgb_stream.getvalue(), expected_stream.getvalue()) def test_refresh_l_size(self): pgb_stream = StringIO() expected_stream = StringIO() pgb = ProgressBar(20, 3, stream=pgb_stream) pgb.refresh() expected_stream.write("\r[" + " " * 3 + "]") self.assertEqual(pgb_stream.getvalue(), expected_stream.getvalue()) def _update_test(self, nbops, expected, size=None): pgb_stream = StringIO() expected_stream = StringIO() if size is None: pgb = ProgressBar(nbops, stream=pgb_stream) size = 20 else: pgb = ProgressBar(nbops, size, stream=pgb_stream) last = 0 for round in expected: if not hasattr(round, "__int__"): dots, update = round else: dots, update = round, None pgb.update() if update or (update is None and dots != last): last = dots expected_stream.write("\r[" + ("=" * dots) + (" " * (size - dots)) + "]") self.assertEqual(pgb_stream.getvalue(), expected_stream.getvalue()) def test_default(self): self._update_test(20, range(1, 21)) def test_nbops_gt_size(self): """Test the progress bar for nbops > size""" def half(total): for counter in range(1, total + 1): yield counter // 2 self._update_test(40, half(40)) def test_nbops_lt_size(self): """Test the progress bar for nbops < size""" def double(total): for counter in range(1, total + 1): yield counter * 2 self._update_test(10, double(10)) def test_nbops_nomul_size(self): """Test the 
progress bar for size % nbops !=0 (non int number of dots per update)""" self._update_test(3, (6, 13, 20)) def test_overflow(self): self._update_test(5, (8, 16, 25, 33, 42, (42, True)), size=42) def test_update_exact(self): pgb_stream = StringIO() expected_stream = StringIO() size = 20 pgb = ProgressBar(100, size, stream=pgb_stream) for dots in range(10, 105, 15): pgb.update(dots, exact=True) dots //= 5 expected_stream.write("\r[" + ("=" * dots) + (" " * (size - dots)) + "]") self.assertEqual(pgb_stream.getvalue(), expected_stream.getvalue()) def test_update_relative(self): pgb_stream = StringIO() expected_stream = StringIO() size = 20 pgb = ProgressBar(100, size, stream=pgb_stream) for dots in range(5, 105, 5): pgb.update(5, exact=False) dots //= 5 expected_stream.write("\r[" + ("=" * dots) + (" " * (size - dots)) + "]") self.assertEqual(pgb_stream.getvalue(), expected_stream.getvalue()) class RawInputTC(TestCase): def auto_input(self, *args): self.input_args = args return self.input_answer def setUp(self): null_printer = lambda x: None self.qa = RawInput(self.auto_input, null_printer) def test_ask_using_builtin_input(self): with patch("builtins.input", return_value="no"): qa = RawInput() answer = qa.ask("text", ("yes", "no"), "yes") self.assertEqual(answer, "no") def test_ask_default(self): self.input_answer = "" answer = self.qa.ask("text", ("yes", "no"), "yes") self.assertEqual(answer, "yes") self.input_answer = " " answer = self.qa.ask("text", ("yes", "no"), "yes") self.assertEqual(answer, "yes") def test_ask_case(self): self.input_answer = "no" answer = self.qa.ask("text", ("yes", "no"), "yes") self.assertEqual(answer, "no") self.input_answer = "No" answer = self.qa.ask("text", ("yes", "no"), "yes") self.assertEqual(answer, "no") self.input_answer = "NO" answer = self.qa.ask("text", ("yes", "no"), "yes") self.assertEqual(answer, "no") self.input_answer = "nO" answer = self.qa.ask("text", ("yes", "no"), "yes") self.assertEqual(answer, "no") self.input_answer = "YES" answer = self.qa.ask("text", ("yes", "no"), "yes") self.assertEqual(answer, "yes") def test_ask_prompt(self): self.input_answer = "" self.qa.ask("text", ("yes", "no"), "yes") self.assertEqual(self.input_args[0], "text [Y(es)/n(o)]: ") self.qa.ask("text", ("y", "n"), "y") self.assertEqual(self.input_args[0], "text [Y/n]: ") self.qa.ask("text", ("n", "y"), "y") self.assertEqual(self.input_args[0], "text [n/Y]: ") self.qa.ask("text", ("yes", "no", "maybe", "1"), "yes") self.assertEqual(self.input_args[0], "text [Y(es)/n(o)/m(aybe)/1]: ") def test_ask_ambiguous(self): self.input_answer = "y" self.assertRaises(Exception, self.qa.ask, "text", ("yes", "yep"), "yes") def test_confirm(self): self.input_answer = "y" self.assertEqual(self.qa.confirm("Say yes"), True) self.assertEqual(self.qa.confirm("Say yes", default_is_yes=False), True) self.input_answer = "n" self.assertEqual(self.qa.confirm("Say yes"), False) self.assertEqual(self.qa.confirm("Say yes", default_is_yes=False), False) self.input_answer = "" self.assertEqual(self.qa.confirm("Say default"), True) self.assertEqual(self.qa.confirm("Say default", default_is_yes=False), False) if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_table.py0000666000000000000000000004272514762603732016740 0ustar00rootroot# copyright 2003-2012 LOGILAB S.A. (Paris, FRANCE), all rights reserved. 
# contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """ Unittests for table management """ import sys from io import StringIO from logilab.common.testlib import TestCase, unittest_main from logilab.common.table import ( Table, TableStyleSheet, DocbookTableWriter, DocbookRenderer, TableStyle, TableWriter, TableCellRenderer, ) class TableTC(TestCase): """Table TestCase class""" def setUp(self): """Creates a default table""" # from logilab.common import table # reload(table) self.table = Table() self.table.create_rows(["row1", "row2", "row3"]) self.table.create_columns(["col1", "col2"]) def test_valeur_scalaire(self): tab = Table() tab.create_columns(["col1"]) tab.append_row([1]) self.assertEqual(tab, [[1]]) tab.append_row([2]) self.assertEqual(tab[0, 0], 1) self.assertEqual(tab[1, 0], 2) def test_valeur_ligne(self): tab = Table() tab.create_columns(["col1", "col2"]) tab.append_row([1, 2]) self.assertEqual(tab, [[1, 2]]) def test_valeur_colonne(self): tab = Table() tab.create_columns(["col1"]) tab.append_row([1]) tab.append_row([2]) self.assertEqual(tab, [[1], [2]]) self.assertEqual(tab[:, 0], [1, 2]) def test_indexation(self): """we should be able to use [] to access rows""" self.assertEqual(self.table[0], self.table.data[0]) self.assertEqual(self.table[1], self.table.data[1]) def test_iterable(self): """test iter(table)""" it = iter(self.table) self.assertEqual(next(it), self.table.data[0]) self.assertEqual(next(it), self.table.data[1]) def test_get_rows(self): """tests Table.get_rows()""" self.assertEqual(self.table, [[0, 0], [0, 0], [0, 0]]) self.assertEqual(self.table[:], [[0, 0], [0, 0], [0, 0]]) self.table.insert_column(1, range(3), "supp") self.assertEqual(self.table, [[0, 0, 0], [0, 1, 0], [0, 2, 0]]) self.assertEqual(self.table[:], [[0, 0, 0], [0, 1, 0], [0, 2, 0]]) def test_get_cells(self): self.table.insert_column(1, range(3), "supp") self.assertEqual(self.table[0, 1], 0) self.assertEqual(self.table[1, 1], 1) self.assertEqual(self.table[2, 1], 2) self.assertEqual(self.table["row1", "supp"], 0) self.assertEqual(self.table["row2", "supp"], 1) self.assertEqual(self.table["row3", "supp"], 2) self.assertRaises(KeyError, self.table.__getitem__, ("row1", "foo")) self.assertRaises(KeyError, self.table.__getitem__, ("foo", "bar")) def test_shape(self): """tests table shape""" self.assertEqual(self.table.shape, (3, 2)) self.table.insert_column(1, range(3), "supp") self.assertEqual(self.table.shape, (3, 3)) def test_set_column(self): """Tests that table.set_column() works fine.""" self.table.set_column(0, range(3)) self.assertEqual(self.table[0, 0], 0) self.assertEqual(self.table[1, 0], 1) self.assertEqual(self.table[2, 0], 2) def test_set_column_by_id(self): """Tests that table.set_column_by_id() works fine.""" self.table.set_column_by_id("col1", range(3)) self.assertEqual(self.table[0, 0], 0) 
self.assertEqual(self.table[1, 0], 1) self.assertEqual(self.table[2, 0], 2) self.assertRaises(KeyError, self.table.set_column_by_id, "col123", range(3)) def test_cells_ids(self): """tests that we can access cells by giving row/col ids""" self.assertRaises(KeyError, self.table.set_cell_by_ids, "row12", "col1", 12) self.assertRaises(KeyError, self.table.set_cell_by_ids, "row1", "col12", 12) self.assertEqual(self.table[0, 0], 0) self.table.set_cell_by_ids("row1", "col1", "DATA") self.assertEqual(self.table[0, 0], "DATA") self.assertRaises(KeyError, self.table.set_row_by_id, "row12", []) self.table.set_row_by_id("row1", ["1.0", "1.1"]) self.assertEqual(self.table[0, 0], "1.0") def test_insert_row(self): """tests a row insertion""" tmp_data = ["tmp1", "tmp2"] self.table.insert_row(1, tmp_data, "tmprow") self.assertEqual(self.table[1], tmp_data) self.assertEqual(self.table["tmprow"], tmp_data) self.table.delete_row_by_id("tmprow") self.assertRaises(KeyError, self.table.delete_row_by_id, "tmprow") self.assertEqual(self.table[1], [0, 0]) self.assertRaises(KeyError, self.table.__getitem__, "tmprow") def test_get_column(self): """Tests that table.get_column() works fine.""" self.table.set_cell(0, 1, 12) self.table.set_cell(2, 1, 13) self.assertEqual(self.table[:, 1], [12, 0, 13]) self.assertEqual(self.table[:, "col2"], [12, 0, 13]) def test_get_columns(self): """Tests if table.get_columns() works fine.""" self.table.set_cell(0, 1, 12) self.table.set_cell(2, 1, 13) self.assertEqual(self.table.get_columns(), [[0, 0, 0], [12, 0, 13]]) def test_insert_column(self): """Tests that table.insert_column() works fine.""" self.table.insert_column(1, range(3), "inserted_column") self.assertEqual(self.table[:, 1], [0, 1, 2]) self.assertEqual(self.table.col_names, ["col1", "inserted_column", "col2"]) def test_delete_column(self): """Tests that table.delete_column() works fine.""" self.table.delete_column(1) self.assertEqual(self.table.col_names, ["col1"]) self.assertEqual(self.table[:, 0], [0, 0, 0]) self.assertRaises(KeyError, self.table.delete_column_by_id, "col2") self.table.delete_column_by_id("col1") self.assertEqual(self.table.col_names, []) def test_transpose(self): """Tests that table.transpose() works fine.""" self.table.append_column(range(5, 8), "col3") ttable = self.table.transpose() self.assertEqual(ttable.row_names, ["col1", "col2", "col3"]) self.assertEqual(ttable.col_names, ["row1", "row2", "row3"]) self.assertEqual(ttable.data, [[0, 0, 0], [0, 0, 0], [5, 6, 7]]) def test_sort_table(self): """Tests the table sort by column""" self.table.set_column(0, [3, 1, 2]) self.table.set_column(1, [1, 2, 3]) self.table.sort_by_column_index(0) self.assertEqual(self.table.row_names, ["row2", "row3", "row1"]) self.assertEqual(self.table.data, [[1, 2], [2, 3], [3, 1]]) self.table.sort_by_column_index(1, "desc") self.assertEqual(self.table.row_names, ["row3", "row2", "row1"]) self.assertEqual(self.table.data, [[2, 3], [1, 2], [3, 1]]) def test_sort_by_id(self): """tests sort_by_column_id()""" self.table.set_column_by_id("col1", [3, 1, 2]) self.table.set_column_by_id("col2", [1, 2, 3]) self.table.sort_by_column_id("col1") self.assertRaises(KeyError, self.table.sort_by_column_id, "col123") self.assertEqual(self.table.row_names, ["row2", "row3", "row1"]) self.assertEqual(self.table.data, [[1, 2], [2, 3], [3, 1]]) self.table.sort_by_column_id("col2", "desc") self.assertEqual(self.table.row_names, ["row3", "row2", "row1"]) self.assertEqual(self.table.data, [[2, 3], [1, 2], [3, 1]]) def test_pprint(self): """only tests 
pprint doesn't raise an exception""" self.table.pprint() str(self.table) class GroupByTC(TestCase): """specific test suite for groupby()""" def setUp(self): t = Table() t.create_columns(["date", "res", "task", "usage"]) t.append_row(["date1", "ing1", "task1", 0.3]) t.append_row(["date1", "ing2", "task2", 0.3]) t.append_row(["date2", "ing3", "task3", 0.3]) t.append_row(["date3", "ing4", "task2", 0.3]) t.append_row(["date1", "ing1", "task3", 0.3]) t.append_row(["date3", "ing1", "task3", 0.3]) self.table = t def test_single_groupby(self): """tests groupby() on several columns""" grouped = self.table.groupby("date") self.assertEqual(len(grouped), 3) self.assertEqual(len(grouped["date1"]), 3) self.assertEqual(len(grouped["date2"]), 1) self.assertEqual(len(grouped["date3"]), 2) self.assertEqual( grouped["date1"], [ ("date1", "ing1", "task1", 0.3), ("date1", "ing2", "task2", 0.3), ("date1", "ing1", "task3", 0.3), ], ) self.assertEqual(grouped["date2"], [("date2", "ing3", "task3", 0.3)]) self.assertEqual( grouped["date3"], [ ("date3", "ing4", "task2", 0.3), ("date3", "ing1", "task3", 0.3), ], ) def test_multiple_groupby(self): """tests groupby() on several columns""" grouped = self.table.groupby("date", "task") self.assertEqual(len(grouped), 3) self.assertEqual(len(grouped["date1"]), 3) self.assertEqual(len(grouped["date2"]), 1) self.assertEqual(len(grouped["date3"]), 2) self.assertEqual(grouped["date1"]["task1"], [("date1", "ing1", "task1", 0.3)]) self.assertEqual(grouped["date2"]["task3"], [("date2", "ing3", "task3", 0.3)]) self.assertEqual(grouped["date3"]["task2"], [("date3", "ing4", "task2", 0.3)]) date3 = grouped["date3"] self.assertRaises(KeyError, date3.__getitem__, "task1") def test_select(self): """tests Table.select() method""" rows = self.table.select("date", "date1") self.assertEqual( rows, [ ("date1", "ing1", "task1", 0.3), ("date1", "ing2", "task2", 0.3), ("date1", "ing1", "task3", 0.3), ], ) class TableStyleSheetTC(TestCase): """The Stylesheet test case""" def setUp(self): """Builds a simple table to test the stylesheet""" self.table = Table() self.table.create_row("row1") self.table.create_columns(["a", "b", "c"]) self.stylesheet = TableStyleSheet() # We don't want anything to be printed self.stdout_backup = sys.stdout sys.stdout = StringIO() def tearDown(self): sys.stdout = self.stdout_backup def test_add_rule(self): """Tests that the regex pattern works as expected.""" rule = "0_2 = sqrt(0_0**2 + 0_1**2)" self.stylesheet.add_rule(rule) self.table.set_row(0, [3, 4, 0]) self.table.apply_stylesheet(self.stylesheet) self.assertEqual(self.table[0], [3, 4, 5]) self.assertEqual(len(self.stylesheet.rules), 1) self.stylesheet.add_rule("some bad rule with bad syntax") self.assertEqual(len(self.stylesheet.rules), 1, "Ill-formed rule mustn't be added") self.assertEqual(len(self.stylesheet.instructions), 1, "Ill-formed rule mustn't be added") def test_stylesheet_init(self): """tests Stylesheet.__init__""" rule = "0_2 = 1" sheet = TableStyleSheet([rule, "bad rule"]) self.assertEqual(len(sheet.rules), 1, "Ill-formed rule mustn't be added") self.assertEqual(len(sheet.instructions), 1, "Ill-formed rule mustn't be added") def test_rowavg_rule(self): """Tests that add_rowavg_rule works as expected""" self.table.set_row(0, [10, 20, 0]) self.stylesheet.add_rowavg_rule((0, 2), 0, 0, 1) self.table.apply_stylesheet(self.stylesheet) val = self.table[0, 2] self.assertEqual(int(val), 15) def test_rowsum_rule(self): """Tests that add_rowsum_rule works as expected""" self.table.set_row(0, [10, 20, 0]) 
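        # Descriptive note (added, hedged): reading the rule arguments as
        # (destination cell, source row, first column, last column), the rule added
        # below should store 10 + 20 == 30 in cell (0, 2), which is what the
        # assertion at the end of this test checks.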
self.stylesheet.add_rowsum_rule((0, 2), 0, 0, 1) self.table.apply_stylesheet(self.stylesheet) val = self.table[0, 2] self.assertEqual(val, 30) def test_colavg_rule(self): """Tests that add_colavg_rule works as expected""" self.table.set_row(0, [10, 20, 0]) self.table.append_row([12, 8, 3], "row2") self.table.create_row("row3") self.stylesheet.add_colavg_rule((2, 0), 0, 0, 1) self.table.apply_stylesheet(self.stylesheet) val = self.table[2, 0] self.assertEqual(int(val), 11) def test_colsum_rule(self): """Tests that add_colsum_rule works as expected""" self.table.set_row(0, [10, 20, 0]) self.table.append_row([12, 8, 3], "row2") self.table.create_row("row3") self.stylesheet.add_colsum_rule((2, 0), 0, 0, 1) self.table.apply_stylesheet(self.stylesheet) val = self.table[2, 0] self.assertEqual(val, 22) class TableStyleTC(TestCase): """Test suite for TableSuite""" def setUp(self): self.table = Table() self.table.create_rows(["row1", "row2", "row3"]) self.table.create_columns(["col1", "col2"]) self.style = TableStyle(self.table) self._tested_attrs = (("size", "1*"), ("alignment", "right"), ("unit", "")) def test_getset(self): """tests style's get and set methods""" for attrname, default_value in self._tested_attrs: getter = getattr(self.style, f"get_{attrname}") setter = getattr(self.style, f"set_{attrname}") self.assertRaises(KeyError, getter, "badcol") self.assertEqual(getter("col1"), default_value) setter("FOO", "col1") self.assertEqual(getter("col1"), "FOO") def test_getset_index(self): """tests style's get and set by index methods""" for attrname, default_value in self._tested_attrs: getter = getattr(self.style, f"get_{attrname}") getattr(self.style, f"set_{attrname}") igetter = getattr(self.style, f"get_{attrname}_by_index") isetter = getattr(self.style, f"set_{attrname}_by_index") self.assertEqual(getter("__row_column__"), default_value) isetter("FOO", 0) self.assertEqual(getter("__row_column__"), "FOO") self.assertEqual(igetter(0), "FOO") self.assertEqual(getter("col1"), default_value) isetter("FOO", 1) self.assertEqual(getter("col1"), "FOO") self.assertEqual(igetter(1), "FOO") class RendererTC(TestCase): """Test suite for DocbookRenderer""" def setUp(self): self.renderer = DocbookRenderer(alignment=True) self.table = Table() self.table.create_rows(["row1", "row2", "row3"]) self.table.create_columns(["col1", "col2"]) self.style = TableStyle(self.table) self.base_renderer = TableCellRenderer() def test_cell_content(self): """test how alignment is rendered""" entry_xml = self.renderer._render_cell_content("data", self.style, 1) self.assertEqual(entry_xml, "data\n") self.style.set_alignment_by_index("left", 1) entry_xml = self.renderer._render_cell_content("data", self.style, 1) self.assertEqual(entry_xml, "data\n") def test_default_content_rendering(self): """tests that default rendering just prints the cell's content""" rendered_cell = self.base_renderer._render_cell_content("data", self.style, 1) self.assertEqual(rendered_cell, "data") def test_replacement_char(self): """tests that 0 is replaced when asked for""" cell_content = self.base_renderer._make_cell_content(0, self.style, 1) self.assertEqual(cell_content, 0) self.base_renderer.properties["skip_zero"] = "---" cell_content = self.base_renderer._make_cell_content(0, self.style, 1) self.assertEqual(cell_content, "---") def test_unit(self): """tests if units are added""" self.base_renderer.properties["units"] = True self.style.set_unit_by_index("EUR", 1) cell_content = self.base_renderer._make_cell_content(12, self.style, 1) 
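        # Descriptive note (added): with the "units" property enabled and the unit
        # "EUR" set on column index 1, the renderer is expected to append the unit
        # to the raw value, hence the "12 EUR" assertion below.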
self.assertEqual(cell_content, "12 EUR") class DocbookTableWriterTC(TestCase): """TestCase for table's writer""" def setUp(self): self.stream = StringIO() self.table = Table() self.table.create_rows(["row1", "row2", "row3"]) self.table.create_columns(["col1", "col2"]) self.writer = DocbookTableWriter(self.stream, self.table, None) self.writer.set_renderer(DocbookRenderer()) def test_write_table(self): """make sure write_table() doesn't raise any exception""" self.writer.write_table() def test_abstract_writer(self): """tests that Abstract Writers can't be used !""" writer = TableWriter(self.stream, self.table, None) self.assertRaises(NotImplementedError, writer.write_table) if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_taskqueue.py0000666000000000000000000000520514762603732017650 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . from logilab.common.testlib import TestCase, unittest_main from logilab.common.tasksqueue import Task, PrioritizedTasksQueue, LOW, MEDIUM, HIGH class TaskTC(TestCase): def test_eq(self): self.assertNotEqual(Task("t1"), Task("t2")) self.assertEqual(Task("t1"), Task("t1")) def test_cmp(self): self.assertLess(Task("t1", LOW), Task("t2", MEDIUM)) self.assertFalse(Task("t1", LOW) > Task("t2", MEDIUM)) self.assertGreater(Task("t1", HIGH), Task("t2", MEDIUM)) self.assertFalse(Task("t1", HIGH) < Task("t2", MEDIUM)) class PrioritizedTasksQueueTC(TestCase): def test_priority(self): queue = PrioritizedTasksQueue() queue.put(Task("t1")) queue.put(Task("t2", MEDIUM)) queue.put(Task("t3", HIGH)) queue.put(Task("t4", LOW)) self.assertEqual(queue.get().id, "t3") self.assertEqual(queue.get().id, "t2") self.assertEqual(queue.get().id, "t1") self.assertEqual(queue.get().id, "t4") def test_remove_equivalent(self): queue = PrioritizedTasksQueue() queue.put(Task("t1")) queue.put(Task("t2", MEDIUM)) queue.put(Task("t1", HIGH)) queue.put(Task("t3", MEDIUM)) queue.put(Task("t2", MEDIUM)) self.assertEqual(queue.qsize(), 3) self.assertEqual(queue.get().id, "t1") self.assertEqual(queue.get().id, "t2") self.assertEqual(queue.get().id, "t3") self.assertEqual(queue.qsize(), 0) def test_remove(self): queue = PrioritizedTasksQueue() queue.put(Task("t1")) queue.put(Task("t2")) queue.put(Task("t3")) queue.remove("t2") self.assertEqual([t.id for t in queue], ["t3", "t1"]) self.assertRaises(ValueError, queue.remove, "t4") if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_testlib.py0000666000000000000000000006642714762603732017324 0ustar00rootroot# copyright 2003-2012 LOGILAB S.A. 
(Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """unittest module for logilab.comon.testlib""" import os import sys from io import StringIO from os.path import join, dirname, isdir, isfile, abspath, exists import tempfile import shutil try: __file__ except NameError: __file__ = sys.argv[0] from logilab.common.testlib import ( TestSuite, unittest_main, Tags, TestCase, mock_object, create_files, InnerTest, tag, require_version, require_module, ) from logilab.common.testlib import SkipAwareTextTestRunner, NonStrictTestLoader class MockTestCase(TestCase): def fail(self, msg): raise AssertionError(msg) class UtilTC(TestCase): def test_mockobject(self): obj = mock_object(foo="bar", baz="bam") self.assertEqual(obj.foo, "bar") self.assertEqual(obj.baz, "bam") def test_create_files(self): chroot = tempfile.mkdtemp() path_to = lambda path: join(chroot, path) dircontent = lambda path: sorted(os.listdir(join(chroot, path))) try: self.assertFalse(isdir(path_to("a/"))) create_files(["a/b/foo.py", "a/b/c/", "a/b/c/d/e.py"], chroot) # make sure directories exist self.assertTrue(isdir(path_to("a"))) self.assertTrue(isdir(path_to("a/b"))) self.assertTrue(isdir(path_to("a/b/c"))) self.assertTrue(isdir(path_to("a/b/c/d"))) # make sure files exist self.assertTrue(isfile(path_to("a/b/foo.py"))) self.assertTrue(isfile(path_to("a/b/c/d/e.py"))) # make sure only asked files were created self.assertEqual(dircontent("a"), ["b"]) self.assertEqual(dircontent("a/b"), ["c", "foo.py"]) self.assertEqual(dircontent("a/b/c"), ["d"]) self.assertEqual(dircontent("a/b/c/d"), ["e.py"]) finally: shutil.rmtree(chroot) class TestlibTC(TestCase): def mkdir(self, path): if not exists(path): self._dirs.add(path) os.mkdir(path) def setUp(self): self.tc = MockTestCase() self._dirs = set() def tearDown(self): while self._dirs: shutil.rmtree(self._dirs.pop(), ignore_errors=True) def test_dict_equals(self): """tests TestCase.assertDictEqual""" d1 = {"a": 1, "b": 2} d2 = {"a": 1, "b": 3} d3 = dict(d1) self.assertRaises(AssertionError, self.tc.assertDictEqual, d1, d2) self.tc.assertDictEqual(d1, d3) self.tc.assertDictEqual(d3, d1) self.tc.assertDictEqual(d1, d1) def test_list_equals(self): """tests TestCase.assertListEqual""" l1 = list(range(10)) l2 = list(range(5)) l3 = list(range(10)) self.assertRaises(AssertionError, self.tc.assertListEqual, l1, l2) self.tc.assertListEqual(l1, l1) self.tc.assertListEqual(l1, l3) self.tc.assertListEqual(l3, l1) def test_equality_for_sets(self): s1 = set("ab") s2 = set("a") self.assertRaises(AssertionError, self.tc.assertSetEqual, s1, s2) self.tc.assertSetEqual(s1, s1) self.tc.assertSetEqual(set(), set()) def test_text_equality(self): self.assertRaises(AssertionError, self.tc.assertMultiLineEqual, "toto", 12) self.assertRaises(AssertionError, self.tc.assertMultiLineEqual, "toto", 12) 
self.assertRaises(AssertionError, self.tc.assertMultiLineEqual, "toto", None) self.assertRaises(AssertionError, self.tc.assertMultiLineEqual, "toto", None) self.assertRaises(AssertionError, self.tc.assertMultiLineEqual, 3.12, "toto") self.assertRaises(AssertionError, self.tc.assertMultiLineEqual, 3.12, "toto") self.assertRaises(AssertionError, self.tc.assertMultiLineEqual, None, "toto") self.assertRaises(AssertionError, self.tc.assertMultiLineEqual, None, "toto") self.tc.assertMultiLineEqual("toto\ntiti", "toto\ntiti") self.tc.assertMultiLineEqual("toto\ntiti", "toto\ntiti") self.assertRaises( AssertionError, self.tc.assertMultiLineEqual, "toto\ntiti", "toto\n titi\n" ) self.assertRaises( AssertionError, self.tc.assertMultiLineEqual, "toto\ntiti", "toto\n titi\n" ) foo = join(dirname(__file__), "data", "foo.txt") spam = join(dirname(__file__), "data", "spam.txt") with open(foo) as fobj: text1 = fobj.read() self.tc.assertMultiLineEqual(text1, text1) self.tc.assertMultiLineEqual(text1, text1) with open(spam) as fobj: text2 = fobj.read() self.assertRaises(AssertionError, self.tc.assertMultiLineEqual, text1, text2) self.assertRaises(AssertionError, self.tc.assertMultiLineEqual, text1, text2) def test_default_datadir(self): expected_datadir = join(dirname(abspath(__file__)), "data") self.assertEqual(self.datadir, expected_datadir) self.assertEqual(self.datapath("foo"), join(expected_datadir, "foo")) def test_multiple_args_datadir(self): expected_datadir = join(dirname(abspath(__file__)), "data") self.assertEqual(self.datadir, expected_datadir) self.assertEqual(self.datapath("foo", "bar"), join(expected_datadir, "foo", "bar")) def test_custom_datadir(self): class MyTC(TestCase): datadir = "foo" def test_1(self): pass # class' custom datadir tc = MyTC("test_1") self.assertEqual(tc.datapath("bar"), join("foo", "bar")) def test_cached_datadir(self): """test datadir is cached on the class""" class MyTC(TestCase): def test_1(self): pass expected_datadir = join(dirname(abspath(__file__)), "data") tc = MyTC("test_1") self.assertEqual(tc.datadir, expected_datadir) # changing module should not change the datadir MyTC.__module__ = "os" self.assertEqual(tc.datadir, expected_datadir) # even on new instances tc2 = MyTC("test_1") self.assertEqual(tc2.datadir, expected_datadir) def test_is(self): obj_1 = [] obj_2 = [] self.assertIs(obj_1, obj_1) self.assertRaises(AssertionError, self.assertIs, obj_1, obj_2) def test_isnot(self): obj_1 = [] obj_2 = [] self.assertIsNot(obj_1, obj_2) self.assertRaises(AssertionError, self.assertIsNot, obj_1, obj_1) def test_none(self): self.assertIsNone(None) self.assertRaises(AssertionError, self.assertIsNone, object()) def test_not_none(self): self.assertIsNotNone(object()) self.assertRaises(AssertionError, self.assertIsNotNone, None) def test_in(self): self.assertIn("a", "dsqgaqg") obj, seq = "a", ("toto", "azf", "coin") self.assertRaises(AssertionError, self.assertIn, obj, seq) def test_not_in(self): self.assertNotIn("a", ("toto", "azf", "coin")) self.assertRaises(AssertionError, self.assertNotIn, "a", "dsqgaqg") class GenerativeTestsTC(TestCase): def setUp(self): output = StringIO() self.runner = SkipAwareTextTestRunner(stream=output) def test_generative_ok(self): class FooTC(TestCase): def test_generative(self): for i in range(10): yield self.assertEqual, i, i result = self.runner.run(FooTC("test_generative")) self.assertEqual(result.testsRun, 10) self.assertEqual(len(result.failures), 0) self.assertEqual(len(result.errors), 0) def test_generative_half_bad(self): class 
FooTC(TestCase): def test_generative(self): for i in range(10): yield self.assertEqual, i % 2, 0 result = self.runner.run(FooTC("test_generative")) self.assertEqual(result.testsRun, 10) self.assertEqual(len(result.failures), 5) self.assertEqual(len(result.errors), 0) def test_generative_error(self): class FooTC(TestCase): def test_generative(self): for i in range(10): if i == 5: raise ValueError("STOP !") yield self.assertEqual, i, i result = self.runner.run(FooTC("test_generative")) self.assertEqual(result.testsRun, 5) self.assertEqual(len(result.failures), 0) self.assertEqual(len(result.errors), 1) def test_generative_error2(self): class FooTC(TestCase): def test_generative(self): for i in range(10): if i == 5: yield self.ouch yield self.assertEqual, i, i def ouch(self): raise ValueError("stop !") result = self.runner.run(FooTC("test_generative")) self.assertEqual(result.testsRun, 11) self.assertEqual(len(result.failures), 0) self.assertEqual(len(result.errors), 1) def test_generative_setup(self): class FooTC(TestCase): def setUp(self): raise ValueError("STOP !") def test_generative(self): for i in range(10): yield self.assertEqual, i, i result = self.runner.run(FooTC("test_generative")) self.assertEqual(result.testsRun, 1) self.assertEqual(len(result.failures), 0) self.assertEqual(len(result.errors), 1) def test_generative_inner_skip(self): class FooTC(TestCase): def check(self, val): if val == 5: self.innerSkip("no 5") else: self.assertEqual(val, val) def test_generative(self): for i in range(10): yield InnerTest(f"check_{i}", self.check, i) result = self.runner.run(FooTC("test_generative")) self.assertEqual(result.testsRun, 10) self.assertEqual(len(result.failures), 0) self.assertEqual(len(result.errors), 0) self.assertEqual(len(result.skipped), 1) def test_generative_skip(self): class FooTC(TestCase): def check(self, val): if val == 5: self.skipTest("no 5") else: self.assertEqual(val, val) def test_generative(self): for i in range(10): yield InnerTest(f"check_{i}", self.check, i) result = self.runner.run(FooTC("test_generative")) self.assertEqual(result.testsRun, 10) self.assertEqual(len(result.failures), 0) self.assertEqual(len(result.errors), 0) self.assertEqual(len(result.skipped), 1) def test_generative_inner_error(self): class FooTC(TestCase): def check(self, val): if val == 5: raise ValueError("no 5") else: self.assertEqual(val, val) def test_generative(self): for i in range(10): yield InnerTest(f"check_{i}", self.check, i) result = self.runner.run(FooTC("test_generative")) self.assertEqual(result.testsRun, 10) self.assertEqual(len(result.failures), 0) self.assertEqual(len(result.errors), 1) self.assertEqual(len(result.skipped), 0) def test_generative_inner_failure(self): class FooTC(TestCase): def check(self, val): if val == 5: self.assertEqual(val, val + 1) else: self.assertEqual(val, val) def test_generative(self): for i in range(10): yield InnerTest(f"check_{i}", self.check, i) result = self.runner.run(FooTC("test_generative")) self.assertEqual(result.testsRun, 10) self.assertEqual(len(result.failures), 1) self.assertEqual(len(result.errors), 0) self.assertEqual(len(result.skipped), 0) def test_generative_outer_failure(self): class FooTC(TestCase): def test_generative(self): self.fail() yield result = self.runner.run(FooTC("test_generative")) self.assertEqual(result.testsRun, 0) self.assertEqual(len(result.failures), 1) self.assertEqual(len(result.errors), 0) self.assertEqual(len(result.skipped), 0) def test_generative_outer_skip(self): class FooTC(TestCase): def 
test_generative(self): self.skipTest("blah") yield result = self.runner.run(FooTC("test_generative")) self.assertEqual(result.testsRun, 0) self.assertEqual(len(result.failures), 0) self.assertEqual(len(result.errors), 0) self.assertEqual(len(result.skipped), 1) class ExitFirstTC(TestCase): def setUp(self): output = StringIO() self.runner = SkipAwareTextTestRunner(stream=output, exitfirst=True) def test_failure_exit_first(self): class FooTC(TestCase): def test_1(self): pass def test_2(self): assert False def test_3(self): pass tests = [FooTC("test_1"), FooTC("test_2")] result = self.runner.run(TestSuite(tests)) self.assertEqual(result.testsRun, 2) self.assertEqual(len(result.failures), 1) self.assertEqual(len(result.errors), 0) def test_error_exit_first(self): class FooTC(TestCase): def test_1(self): pass def test_2(self): raise ValueError() def test_3(self): pass tests = [FooTC("test_1"), FooTC("test_2"), FooTC("test_3")] result = self.runner.run(TestSuite(tests)) self.assertEqual(result.testsRun, 2) self.assertEqual(len(result.failures), 0) self.assertEqual(len(result.errors), 1) def test_generative_exit_first(self): class FooTC(TestCase): def test_generative(self): for i in range(10): yield self.assertTrue, False result = self.runner.run(FooTC("test_generative")) self.assertEqual(result.testsRun, 1) self.assertEqual(len(result.failures), 1) self.assertEqual(len(result.errors), 0) class TestLoaderTC(TestCase): # internal classes for test purposes ######## class FooTC(TestCase): def test_foo1(self): pass def test_foo2(self): pass def test_bar1(self): pass class BarTC(TestCase): def test_bar2(self): pass ############################################## def setUp(self): self.loader = NonStrictTestLoader() self.module = ( TestLoaderTC # mock_object(FooTC=TestLoaderTC.FooTC, BarTC=TestLoaderTC.BarTC) ) self.output = StringIO() self.runner = SkipAwareTextTestRunner(stream=self.output) def assertRunCount(self, pattern, module, expected_count, skipped=()): self.loader.test_pattern = pattern self.loader.skipped_patterns = skipped if pattern: suite = self.loader.loadTestsFromNames([pattern], module) else: suite = self.loader.loadTestsFromModule(module) result = self.runner.run(suite) self.loader.test_pattern = None self.loader.skipped_patterns = () self.assertEqual(result.testsRun, expected_count) def test_collect_everything(self): """make sure we don't change the default behaviour for loadTestsFromModule() and loadTestsFromTestCase """ testsuite = self.loader.loadTestsFromModule(self.module) self.assertEqual(len(testsuite._tests), 2) suite1, suite2 = testsuite._tests self.assertEqual(len(suite1._tests) + len(suite2._tests), 4) def test_collect_with_classname(self): self.assertRunCount("FooTC", self.module, 3) self.assertRunCount("BarTC", self.module, 1) def test_collect_with_classname_and_pattern(self): data = [ ("FooTC.test_foo1", 1), ("FooTC.test_foo", 2), ("FooTC.test_fo", 2), ("FooTC.foo1", 1), ("FooTC.foo", 2), ("FooTC.whatever", 0), ] for pattern, expected_count in data: self.assertRunCount(pattern, self.module, expected_count) def test_collect_with_pattern(self): data = [ ("test_foo1", 1), ("test_foo", 2), ("test_bar", 2), ("foo1", 1), ("foo", 2), ("bar", 2), ("ba", 2), ("test", 4), ("ab", 0), ] for pattern, expected_count in data: self.assertRunCount(pattern, self.module, expected_count) def test_testcase_with_custom_metaclass(self): class mymetaclass(type): pass class MyMod: class MyTestCase(TestCase): __metaclass__ = mymetaclass def test_foo1(self): pass def test_foo2(self): pass def 
test_bar(self): pass data = [ ("test_foo1", 1), ("test_foo", 2), ("test_bar", 1), ("foo1", 1), ("foo", 2), ("bar", 1), ("ba", 1), ("test", 3), ("ab", 0), ("MyTestCase.test_foo1", 1), ("MyTestCase.test_foo", 2), ("MyTestCase.test_fo", 2), ("MyTestCase.foo1", 1), ("MyTestCase.foo", 2), ("MyTestCase.whatever", 0), ] for pattern, expected_count in data: self.assertRunCount(pattern, MyMod, expected_count) def test_collect_everything_and_skipped_patterns(self): testdata = [ (["foo1"], 3), (["foo"], 2), (["foo", "bar"], 0), ] for skipped, expected_count in testdata: self.assertRunCount(None, self.module, expected_count, skipped) def test_collect_specific_pattern_and_skip_some(self): testdata = [ ("bar", ["foo1"], 2), ("bar", [], 2), ("bar", ["bar"], 0), ] for runpattern, skipped, expected_count in testdata: self.assertRunCount(runpattern, self.module, expected_count, skipped) def test_skip_classname(self): testdata = [ (["BarTC"], 3), (["FooTC"], 1), ] for skipped, expected_count in testdata: self.assertRunCount(None, self.module, expected_count, skipped) def test_skip_classname_and_specific_collect(self): testdata = [ ("bar", ["BarTC"], 1), ("foo", ["FooTC"], 0), ] for runpattern, skipped, expected_count in testdata: self.assertRunCount(runpattern, self.module, expected_count, skipped) def test_nonregr_dotted_path(self): self.assertRunCount("FooTC.test_foo", self.module, 2) def test_inner_tests_selection(self): class MyMod: class MyTestCase(TestCase): def test_foo(self): pass def test_foobar(self): for i in range(5): if i % 2 == 0: yield InnerTest("even", lambda: None) else: yield InnerTest("odd", lambda: None) yield lambda: None # FIXME InnerTest masked by pattern usage # data = [('foo', 7), ('test_foobar', 6), ('even', 3), ('odd', 2), ] data = [ ("foo", 7), ("test_foobar", 6), ("even", 0), ("odd", 0), ] for pattern, expected_count in data: self.assertRunCount(pattern, MyMod, expected_count) def test_nonregr_class_skipped_option(self): class MyMod: class MyTestCase(TestCase): def test_foo(self): pass def test_bar(self): pass class FooTC(TestCase): def test_foo(self): pass self.assertRunCount("foo", MyMod, 2) self.assertRunCount(None, MyMod, 3) self.assertRunCount("foo", MyMod, 1, ["FooTC"]) self.assertRunCount(None, MyMod, 2, ["FooTC"]) def test__classes_are_ignored(self): class MyMod: class _Base(TestCase): def test_1(self): pass class MyTestCase(_Base): def test_2(self): pass self.assertRunCount(None, MyMod, 2) class DecoratorTC(TestCase): def setUp(self): self.pyversion = sys.version_info def tearDown(self): sys.version_info = self.pyversion def test_require_version_good(self): """should return the same function""" def func(): pass sys.version_info = (2, 5, 5, "final", 4) current = sys.version_info[:3] compare = ("2.4", "2.5", "2.5.4", "2.5.5") for version in compare: decorator = require_version(version) self.assertEqual( func, decorator(func), "%s =< %s : function \ return by the decorator should be the same." % (version, ".".join([str(element) for element in current])), ) def test_require_version_bad(self): """should return a different function : skipping test""" def func(): pass sys.version_info = (2, 5, 5, "final", 4) current = sys.version_info[:3] compare = ("2.5.6", "2.6", "2.6.5") for version in compare: decorator = require_version(version) self.assertNotEqual( func, decorator(func), "%s >= %s : function \ return by the decorator should NOT be the same." 
% (".".join([str(element) for element in current]), version), ) def test_require_version_exception(self): """should throw a ValueError exception""" def func(): pass compare = ("2.5.a", "2.a", "azerty") for version in compare: decorator = require_version(version) self.assertRaises(ValueError, decorator, func) def test_require_module_good(self): """should return the same function""" def func(): pass module = "sys" decorator = require_module(module) self.assertEqual( func, decorator(func), "module %s exists : function \ return by the decorator should be the same." % module, ) def test_require_module_bad(self): """should return a different function : skipping test""" def func(): pass modules = ("bla", "blo", "bli") for module in modules: try: __import__(module) except ImportError: decorator = require_module(module) self.assertNotEqual( func, decorator(func), "module %s does \ not exist : function return by the decorator should \ NOT be the same." % module, ) return print( "all modules in %s exist. Could not test %s" % (", ".join(modules), sys._getframe().f_code.co_name) ) class TagTC(TestCase): def setUp(self): @tag("testing", "bob") def bob(a, b, c): return (a + b) * c self.func = bob class TagTestTC(TestCase): tags = Tags("one", "two") def test_one(self): self.assertTrue(True) @tag("two", "three") def test_two(self): self.assertTrue(True) @tag("three", inherit=False) def test_three(self): self.assertTrue(True) self.cls = TagTestTC def test_tag_decorator(self): bob = self.func self.assertEqual(bob(2, 3, 7), 35) self.assertTrue(hasattr(bob, "tags")) self.assertSetEqual(bob.tags, {"testing", "bob"}) def test_tags_class(self): tags = self.func.tags self.assertTrue(tags["testing"]) self.assertFalse(tags["Not inside"]) def test_tags_match(self): tags = self.func.tags self.assertTrue(tags.match("testing")) self.assertFalse(tags.match("other")) self.assertFalse(tags.match("testing and coin")) self.assertTrue(tags.match("testing or other")) self.assertTrue(tags.match("not other")) self.assertTrue(tags.match("not other or (testing and bibi)")) self.assertTrue(tags.match("other or (testing and bob)")) def test_tagged_class(self): def options(tags): class Options: tags_pattern = tags return Options() tc = self.cls("test_one") runner = SkipAwareTextTestRunner() self.assertTrue(runner.does_match_tags(tc.test_one)) self.assertTrue(runner.does_match_tags(tc.test_two)) self.assertTrue(runner.does_match_tags(tc.test_three)) runner = SkipAwareTextTestRunner(options=options("one")) self.assertTrue(runner.does_match_tags(tc.test_one)) self.assertTrue(runner.does_match_tags(tc.test_two)) self.assertFalse(runner.does_match_tags(tc.test_three)) runner = SkipAwareTextTestRunner(options=options("two")) self.assertTrue(runner.does_match_tags(tc.test_one)) self.assertTrue(runner.does_match_tags(tc.test_two)) self.assertFalse(runner.does_match_tags(tc.test_three)) runner = SkipAwareTextTestRunner(options=options("three")) self.assertFalse(runner.does_match_tags(tc.test_one)) self.assertTrue(runner.does_match_tags(tc.test_two)) self.assertTrue(runner.does_match_tags(tc.test_three)) runner = SkipAwareTextTestRunner(options=options("two or three")) self.assertTrue(runner.does_match_tags(tc.test_one)) self.assertTrue(runner.does_match_tags(tc.test_two)) self.assertTrue(runner.does_match_tags(tc.test_three)) runner = SkipAwareTextTestRunner(options=options("two and three")) self.assertFalse(runner.does_match_tags(tc.test_one)) self.assertTrue(runner.does_match_tags(tc.test_two)) 
self.assertFalse(runner.does_match_tags(tc.test_three)) if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_textutils.py0000666000000000000000000002714214762603732017712 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """ unit tests for module textutils squeleton generated by /home/syt/cvs_work/logilab/pyreverse/py2tests.py on Sep 08 at 09:1:31 """ # flake8: noqa: E501 import doctest import re from os import linesep from logilab.common import textutils as tu from logilab.common.testlib import TestCase, unittest_main if linesep != "\n": LINE_RGX = re.compile(linesep) def ulines(string): return LINE_RGX.sub("\n", string) else: def ulines(string): return string class NormalizeTextTC(TestCase): def test_known_values(self): self.assertEqual( ulines( tu.normalize_text( """some really malformated text. With some times some veeeeeeeeeeeeeeerrrrryyyyyyyyyyyyyyyyyyy loooooooooooooooooooooong linnnnnnnnnnnes and empty lines! """ ) ), """some really malformated text. With some times some veeeeeeeeeeeeeeerrrrryyyyyyyyyyyyyyyyyyy loooooooooooooooooooooong linnnnnnnnnnnes and empty lines!""", ) self.assertMultiLineEqual( ulines( tu.normalize_text( """\ some ReST formated text ======================= With some times some veeeeeeeeeeeeeeerrrrryyyyyyyyyyyyyyyyyyy loooooooooooooooooooooong linnnnnnnnnnnes and normal lines! another paragraph """, rest=True, ) ), """\ some ReST formated text ======================= With some times some veeeeeeeeeeeeeeerrrrryyyyyyyyyyyyyyyyyyy loooooooooooooooooooooong linnnnnnnnnnnes and normal lines! another paragraph""", ) def test_nonregr_unsplitable_word(self): self.assertEqual( ulines( tu.normalize_text( """petit complement : http://www.plonefr.net/blog/archive/2005/10/30/tester-la-future-infrastructure-i18n """, 80, ) ), """petit complement : http://www.plonefr.net/blog/archive/2005/10/30/tester-la-future-infrastructure-i18n""", ) def test_nonregr_rest_normalize(self): self.assertEqual( ulines( tu.normalize_text( """... Il est donc evident que tout le monde doit lire le compte-rendu de RSH et aller discuter avec les autres si c'est utile ou necessaire. """, rest=True, ) ), """... Il est donc evident que tout le monde doit lire le compte-rendu de RSH et aller discuter avec les autres si c'est utile ou necessaire.""", ) def test_normalize_rest_paragraph(self): self.assertEqual( ulines(tu.normalize_rest_paragraph("""**nico**: toto""")), """**nico**: toto""" ) def test_normalize_rest_paragraph2(self): self.assertEqual( ulines( tu.normalize_rest_paragraph( """.. 
_tdm: http://www.editions-eni.fr/Livres/Python-Les-fondamentaux-du-langage---La-programmation-pour-les-scientifiques-Table-des-matieres/.20_adaa41fb-c125-4919-aece-049601e81c8e_0_0.pdf .. _extrait: http://www.editions-eni.fr/Livres/Python-Les-fondamentaux-du-langage---La-programmation-pour-les-scientifiques-Extrait-du-livre/.20_d6eed0be-0d36-4384-be59-2dd09e081012_0_0.pdf""", indent="> ", ) ), """> .. _tdm: > http://www.editions-eni.fr/Livres/Python-Les-fondamentaux-du-langage---La-programmation-pour-les-scientifiques-Table-des-matieres/.20_adaa41fb-c125-4919-aece-049601e81c8e_0_0.pdf > .. _extrait: > http://www.editions-eni.fr/Livres/Python-Les-fondamentaux-du-langage---La-programmation-pour-les-scientifiques-Extrait-du-livre/.20_d6eed0be-0d36-4384-be59-2dd09e081012_0_0.pdf""", ) def test_normalize_paragraph2(self): self.assertEqual( ulines( tu.normalize_paragraph( """.. _tdm: http://www.editions-eni.fr/Livres/Python-Les-fondamentaux-du-langage---La-programmation-pour-les-scientifiques-Table-des-matieres/.20_adaa41fb-c125-4919-aece-049601e81c8e_0_0.pdf .. _extrait: http://www.editions-eni.fr/Livres/Python-Les-fondamentaux-du-langage---La-programmation-pour-les-scientifiques-Extrait-du-livre/.20_d6eed0be-0d36-4384-be59-2dd09e081012_0_0.pdf""", indent="> ", ) ), """> .. _tdm: > http://www.editions-eni.fr/Livres/Python-Les-fondamentaux-du-langage---La-programmation-pour-les-scientifiques-Table-des-matieres/.20_adaa41fb-c125-4919-aece-049601e81c8e_0_0.pdf > .. _extrait: > http://www.editions-eni.fr/Livres/Python-Les-fondamentaux-du-langage---La-programmation-pour-les-scientifiques-Extrait-du-livre/.20_d6eed0be-0d36-4384-be59-2dd09e081012_0_0.pdf""", ) class NormalizeParagraphTC(TestCase): def test_known_values(self): self.assertEqual( ulines( tu.normalize_text( """This package contains test files shared by the logilab-common package. It isn't necessary to install this package unless you want to execute or look at the tests.""", indent=" ", line_len=70, ) ), """\ This package contains test files shared by the logilab-common package. 
It isn't necessary to install this package unless you want to execute or look at the tests.""", ) class GetCsvTC(TestCase): def test_known(self): self.assertEqual(tu.splitstrip("a, b,c "), ["a", "b", "c"]) class UnitsTC(TestCase): def setUp(self): self.units = { "m": 60, "kb": 1024, "mb": 1024 * 1024, } def test_empty_base(self): self.assertEqual(tu.apply_units("17", {}), 17) def test_empty_inter(self): def inter(value): return int(float(value)) * 2 result = tu.apply_units("12.4", {}, inter=inter) self.assertEqual(result, 12 * 2) self.assertIsInstance(result, float) def test_empty_final(self): # int('12.4') raise value error self.assertRaises(ValueError, tu.apply_units, "12.4", {}, final=int) def test_empty_inter_final(self): result = tu.apply_units("12.4", {}, inter=float, final=int) self.assertEqual(result, 12) self.assertIsInstance(result, int) def test_blank_base(self): result = tu.apply_units(" 42 ", {}, final=int) self.assertEqual(result, 42) def test_blank_space(self): result = tu.apply_units(" 1 337 ", {}, final=int) self.assertEqual(result, 1337) def test_blank_coma(self): result = tu.apply_units(" 4,298.42 ", {}) self.assertEqual(result, 4298.42) def test_blank_mixed(self): result = tu.apply_units("45, 317, 337", {}, final=int) self.assertEqual(result, 45317337) def test_unit_singleunit_singleletter(self): result = tu.apply_units("15m", self.units) self.assertEqual(result, 15 * self.units["m"]) def test_unit_singleunit_multipleletter(self): result = tu.apply_units("47KB", self.units) self.assertEqual(result, 47 * self.units["kb"]) def test_unit_singleunit_caseinsensitive(self): result = tu.apply_units("47kb", self.units) self.assertEqual(result, 47 * self.units["kb"]) def test_unit_multipleunit(self): result = tu.apply_units("47KB 1.5MB", self.units) self.assertEqual(result, 47 * self.units["kb"] + 1.5 * self.units["mb"]) def test_unit_with_blank(self): result = tu.apply_units("1 000 KB", self.units) self.assertEqual(result, 1000 * self.units["kb"]) def test_unit_wrong_input(self): self.assertRaises(ValueError, tu.apply_units, "", self.units) self.assertRaises(ValueError, tu.apply_units, "wrong input", self.units) self.assertRaises(ValueError, tu.apply_units, "wrong13 input", self.units) self.assertRaises(ValueError, tu.apply_units, "wrong input42", self.units) with self.assertRaises(ValueError) as cm: tu.apply_units("42 cakes", self.units) self.assertIn("invalid unit cakes.", str(cm.exception)) RGX = re.compile("abcd") class PrettyMatchTC(TestCase): def test_known(self): string = "hiuherabcdef" self.assertEqual( ulines(tu.pretty_match(RGX.search(string), string)), "hiuherabcdef\n ^^^^" ) def test_known_values_1(self): rgx = re.compile("(to*)") string = "toto" match = rgx.search(string) self.assertEqual( ulines(tu.pretty_match(match, string)), """toto ^^""", ) def test_known_values_2(self): rgx = re.compile("(to*)") string = """ ... ... to to ... ... """ match = rgx.search(string) self.assertEqual( ulines(tu.pretty_match(match, string)), """ ... ... to to ^^ ... 
...""", ) class UnquoteTC(TestCase): def test(self): self.assertEqual(tu.unquote('"toto"'), "toto") self.assertEqual(tu.unquote("'l'inenarrable toto'"), "l'inenarrable toto") self.assertEqual(tu.unquote("no quote"), "no quote") class ColorizeAnsiTC(TestCase): def test_known(self): self.assertEqual(tu.colorize_ansi("hello", "blue", "strike"), "\x1b[9;34mhello\x1b[0m") self.assertEqual( tu.colorize_ansi("hello", style="strike, inverse"), "\x1b[9;7mhello\x1b[0m" ) self.assertEqual(tu.colorize_ansi("hello", None, None), "hello") self.assertEqual(tu.colorize_ansi("hello", "", ""), "hello") def test_raise(self): self.assertRaises(KeyError, tu.colorize_ansi, "hello", "bleu", None) self.assertRaises(KeyError, tu.colorize_ansi, "hello", None, "italique") class UnormalizeTC(TestCase): def test_unormalize_no_substitute(self): data = [ ("\u0153nologie", "oenologie"), ("\u0152nologie", "OEnologie"), ("l\xf8to", "loto"), ("été", "ete"), ("àèùéïîôêç", "aeueiioec"), ("ÀÈÙÉÏÎÔÊÇ", "AEUEIIOEC"), ("\xa0", " "), # NO-BREAK SPACE managed by NFKD decomposition ("\u0154", "R"), ("Pointe d\u2019Yves", "Pointe d'Yves"), ("Bordeaux\u2013Mérignac", "Bordeaux-Merignac"), ] for input, output in data: self.assertEqual(tu.unormalize(input), output) def test_unormalize_substitute(self): self.assertEqual(tu.unormalize("ab \u8000 cd", substitute="_"), "ab _ cd") def test_unormalize_backward_compat(self): self.assertRaises(ValueError, tu.unormalize, "\u8000") self.assertEqual(tu.unormalize("\u8000", substitute=""), "") def load_tests(loader, tests, ignore): tests.addTests(doctest.DocTestSuite(tu)) return tests if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_tree.py0000666000000000000000000002237114762603732016603 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . 
""" unit tests for module logilab.common.tree squeleton generated by /home/syt/bin/py2tests on Jan 20 at 10:43:25 """ from logilab.common.testlib import TestCase, unittest_main from logilab.common.tree import ( Node, NodeNotFound, post_order_list, PostfixedDepthFirstIterator, pre_order_list, PrefixedDepthFirstIterator, ) tree = ( "root", ( ("child_1_1", (("child_2_1", ()), ("child_2_2", (("child_3_1", ()),)))), ("child_1_2", (("child_2_3", ()),)), ), ) def make_tree(tuple): n = Node(tuple[0]) for child in tuple[1]: n.append(make_tree(child)) return n class Node_ClassTest(TestCase): """a basic tree node, caracterised by an id""" def setUp(self): """called before each test from this class""" self.o = make_tree(tree) def test_flatten(self): result = [r.id for r in self.o.flatten()] expected = [ "root", "child_1_1", "child_2_1", "child_2_2", "child_3_1", "child_1_2", "child_2_3", ] self.assertListEqual(result, expected) def test_flatten_with_outlist(self): resultnodes = [] self.o.flatten(resultnodes) result = [r.id for r in resultnodes] expected = [ "root", "child_1_1", "child_2_1", "child_2_2", "child_3_1", "child_1_2", "child_2_3", ] self.assertListEqual(result, expected) def test_known_values_remove(self): """ remove a child node """ self.o.remove(self.o.get_node_by_id("child_1_1")) self.assertRaises(NodeNotFound, self.o.get_node_by_id, "child_1_1") def test_known_values_replace(self): """ replace a child node with another """ self.o.replace(self.o.get_node_by_id("child_1_1"), Node("hoho")) self.assertRaises(NodeNotFound, self.o.get_node_by_id, "child_1_1") self.assertEqual(self.o.get_node_by_id("hoho"), self.o.children[0]) def test_known_values_get_sibling(self): """ return the sibling node that has given id """ self.assertEqual(self.o.children[0].get_sibling("child_1_2"), self.o.children[1], None) def test_raise_get_sibling_NodeNotFound(self): self.assertRaises(NodeNotFound, self.o.children[0].get_sibling, "houhou") def test_known_values_get_node_by_id(self): """ return node in whole hierarchy that has given id """ self.assertEqual(self.o.get_node_by_id("child_1_1"), self.o.children[0]) def test_raise_get_node_by_id_NodeNotFound(self): self.assertRaises(NodeNotFound, self.o.get_node_by_id, "houhou") def test_known_values_get_child_by_id(self): """ return child of given id """ self.assertEqual( self.o.get_child_by_id("child_2_1", recurse=1), self.o.children[0].children[0] ) def test_raise_get_child_by_id_NodeNotFound(self): self.assertRaises(NodeNotFound, self.o.get_child_by_id, nid="child_2_1") self.assertRaises(NodeNotFound, self.o.get_child_by_id, "houhou") def test_known_values_get_child_by_path(self): """ return child of given path (path is a list of ids) """ self.assertEqual( self.o.get_child_by_path(["root", "child_1_1", "child_2_1"]), self.o.children[0].children[0], ) def test_raise_get_child_by_path_NodeNotFound(self): self.assertRaises(NodeNotFound, self.o.get_child_by_path, ["child_1_1", "child_2_11"]) def test_known_values_depth_down(self): """ return depth of this node in the tree """ self.assertEqual(self.o.depth_down(), 4) self.assertEqual(self.o.get_child_by_id("child_2_1", True).depth_down(), 1) def test_known_values_depth(self): """ return depth of this node in the tree """ self.assertEqual(self.o.depth(), 0) self.assertEqual(self.o.get_child_by_id("child_2_1", True).depth(), 2) def test_known_values_width(self): """ return depth of this node in the tree """ self.assertEqual(self.o.width(), 3) self.assertEqual(self.o.get_child_by_id("child_2_1", True).width(), 1) def 
test_known_values_root(self): """ return the root node of the tree """ self.assertEqual(self.o.get_child_by_id("child_2_1", True).root(), self.o) def test_known_values_leaves(self): """ return a list with all the leaf nodes descendant from this task """ self.assertEqual( self.o.leaves(), [ self.o.get_child_by_id("child_2_1", True), self.o.get_child_by_id("child_3_1", True), self.o.get_child_by_id("child_2_3", True), ], ) def test_known_values_lineage(self): c31 = self.o.get_child_by_id("child_3_1", True) self.assertEqual( c31.lineage(), [ self.o.get_child_by_id("child_3_1", True), self.o.get_child_by_id("child_2_2", True), self.o.get_child_by_id("child_1_1", True), self.o, ], ) class post_order_list_FunctionTest(TestCase): def setUp(self): """called before each test from this class""" self.o = make_tree(tree) def test_known_values_post_order_list(self): """ create a list with tree nodes for which the function returned true in a post order foashion """ L = [ "child_2_1", "child_3_1", "child_2_2", "child_1_1", "child_2_3", "child_1_2", "root", ] li = [n.id for n in post_order_list(self.o)] self.assertEqual(li, L, li) def test_known_values_post_order_list2(self): """ create a list with tree nodes for which the function returned true in a post order foashion """ def filter(node): if node.id == "child_2_2": return 0 return 1 L = ["child_2_1", "child_1_1", "child_2_3", "child_1_2", "root"] li = [n.id for n in post_order_list(self.o, filter)] self.assertEqual(li, L, li) class PostfixedDepthFirstIterator_ClassTest(TestCase): def setUp(self): """called before each test from this class""" self.o = make_tree(tree) def test_known_values_next(self): L = ["child_2_1", "child_3_1", "child_2_2", "child_1_1", "child_2_3", "child_1_2", "root"] iter = PostfixedDepthFirstIterator(self.o) o = next(iter) i = 0 while o: self.assertEqual(o.id, L[i]) o = next(iter) i += 1 class pre_order_list_FunctionTest(TestCase): def setUp(self): """called before each test from this class""" self.o = make_tree(tree) def test_known_values_pre_order_list(self): """ create a list with tree nodes for which the function returned true in a pre order fashion """ L = [ "root", "child_1_1", "child_2_1", "child_2_2", "child_3_1", "child_1_2", "child_2_3", ] li = [n.id for n in pre_order_list(self.o)] self.assertEqual(li, L, li) def test_known_values_pre_order_list2(self): """ create a list with tree nodes for which the function returned true in a pre order fashion """ def filter(node): if node.id == "child_2_2": return 0 return 1 L = ["root", "child_1_1", "child_2_1", "child_1_2", "child_2_3"] li = [n.id for n in pre_order_list(self.o, filter)] self.assertEqual(li, L, li) class PrefixedDepthFirstIterator_ClassTest(TestCase): def setUp(self): """called before each test from this class""" self.o = make_tree(tree) def test_known_values_next(self): L = ["root", "child_1_1", "child_2_1", "child_2_2", "child_3_1", "child_1_2", "child_2_3"] iter = PrefixedDepthFirstIterator(self.o) o = next(iter) i = 0 while o: self.assertEqual(o.id, L[i]) o = next(iter) i += 1 if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_umessage.py0000666000000000000000000000643314762603732017456 0ustar00rootroot# copyright 2003-2012 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. 
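# Editor's note: a hedged sketch of the logilab.common.tree API tested in test_tree.py above,
# using a smaller tree built the same way as make_tree(); results follow those assertions.
from logilab.common.tree import Node, post_order_list

root = Node("root")
child = Node("child_1_1")
root.append(child)
child.append(Node("child_2_1"))

assert root.get_node_by_id("child_2_1") is child.children[0]
assert root.depth_down() == 3                    # number of levels from this node downwards
assert [n.id for n in post_order_list(root)] == ["child_2_1", "child_1_1", "root"]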
# # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . import sys import email from os.path import join, dirname, abspath from logilab.common.testlib import TestCase, unittest_main from logilab.common.umessage import UMessage, decode_QP, message_from_string DATA = join(dirname(abspath(__file__)), "data") class UMessageTC(TestCase): def setUp(self): if sys.version_info >= (3, 2): pass msg1 = email.message_from_file(open(join(DATA, "test1.msg"), encoding="utf8")) msg2 = email.message_from_file(open(join(DATA, "test2.msg"), encoding="utf8")) else: msg1 = email.message_from_file(open(join(DATA, "test1.msg"))) msg2 = email.message_from_file(open(join(DATA, "test2.msg"))) self.umessage1 = UMessage(msg1) self.umessage2 = UMessage(msg2) def test_get_subject(self): subj = self.umessage2.get("Subject") self.assertEqual(type(subj), str) self.assertEqual(subj, "À LA MER") def test_get_all(self): to = self.umessage2.get_all("To") self.assertEqual(type(to[0]), str) self.assertEqual(to, ["élément à accents "]) def test_get_payload_no_multi(self): payload = self.umessage1.get_payload() self.assertEqual(type(payload), str) def test_get_payload_decode(self): msg = """\ MIME-Version: 1.0 Content-Type: text/plain; charset="utf-8" Content-Transfer-Encoding: base64 Subject: =?utf-8?q?b=C3=AFjour?= From: =?utf-8?q?oim?= Reply-to: =?utf-8?q?oim?= , =?utf-8?q?BimBam?= X-CW: data To: test@logilab.fr Date: now dW4gcGV0aXQgY8O2dWNvdQ== """ msg = message_from_string(msg) self.assertEqual(msg.get_payload(decode=True), "un petit cöucou") def test_decode_QP(self): test_line = "=??b?UmFwaGHrbA==?= DUPONT" test = decode_QP(test_line) self.assertEqual(type(test), str) self.assertEqual(test, "Raphaël DUPONT") def test_decode_QP_utf8(self): test_line = "=?utf-8?q?o=C3=AEm?= " test = decode_QP(test_line) self.assertEqual(type(test), str) self.assertEqual(test, "oîm ") def test_decode_QP_ascii(self): test_line = "test " test = decode_QP(test_line) self.assertEqual(type(test), str) self.assertEqual(test, "test ") if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_ureports_html.py0000666000000000000000000000545714762603732020561 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. 
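# Editor's note: a minimal sketch (editor's addition) of the decode_QP helper tested in
# test_umessage.py above; the encoded-word example is copied from those tests.
from logilab.common.umessage import decode_QP

assert decode_QP("=??b?UmFwaGHrbA==?= DUPONT") == "Raphaël DUPONT"
# per test_decode_QP_ascii, plain ASCII input comes back unchanged, as a str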
# # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """unit tests for ureports.html_writer""" # flake8: noqa: E501 from utils import WriterTC from logilab.common.testlib import TestCase, unittest_main from logilab.common.ureports.html_writer import HTMLWriter class HTMLWriterTC(TestCase, WriterTC): def setUp(self): self.writer = HTMLWriter(1) # Section tests ########################################################### section_base = """

    Section title

    Section\'s description. Blabla bla

    """ section_nested = """
    \n

    Section title

    \n

    Section\'s description.\nBlabla bla

    \n

    Subsection

    \n

    Sub section description

    \n
    \n""" # List tests ############################################################## list_base = """
      \n
    • item1
    • \n
    • item2
    • \n
    • item3
    • \n
    • item4
    • \n
    \n""" nested_list = """
    • blabla

      • 1
      • 2
      • 3

    • an other point
    """ # Table tests ############################################################# table_base = """\n\n\n\n\n\n\n\n\n
    head1head2
    cell1cell2
    \n""" field_table = """\n\n\n\n\n\n\n\n\n\n\n\n\n
    f1v1
    f22v22
    f333v333
    \n""" advanced_table = """\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n
    fieldvalue
    f1v1
    f22v22
    f333v333
    toi perdu ? 
    \n""" # VerbatimText tests ###################################################### verbatim_base = """
    blablabla
    """ if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_ureports_text.py0000666000000000000000000000450414762603732020571 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . """unit tests for ureports.text_writer""" from utils import WriterTC from logilab.common.testlib import TestCase, unittest_main from logilab.common.ureports.text_writer import TextWriter class TextWriterTC(TestCase, WriterTC): def setUp(self): self.writer = TextWriter() # Section tests ########################################################### section_base = """ Section title ============= Section\'s description. Blabla bla """ section_nested = """ Section title ============= Section\'s description. Blabla bla Subsection ---------- Sub section description """ # List tests ############################################################## list_base = """ * item1 * item2 * item3 * item4""" nested_list = """ * blabla - 1 - 2 - 3 * an other point""" # Table tests ############################################################# table_base = """ +------+------+ |head1 |head2 | +------+------+ |cell1 |cell2 | +------+------+ """ field_table = """ f1 : v1 f22 : v22 f333: v333 """ advanced_table = """ +---------------+------+ |field |value | +===============+======+ |f1 |v1 | +---------------+------+ |f22 |v22 | +---------------+------+ |f333 |v333 | +---------------+------+ |`toi perdu ?`_ | | +---------------+------+ """ # VerbatimText tests ###################################################### verbatim_base = """:: blablabla """ if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/test_xmlutils.py0000666000000000000000000000527014762603732017524 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . 
from logilab.common.testlib import TestCase, unittest_main from logilab.common.xmlutils import parse_pi_data class ProcessingInstructionDataParsingTest(TestCase): def test_empty_pi(self): """ Tests the parsing of the data of an empty processing instruction. """ pi_data = " \t \n " data = parse_pi_data(pi_data) self.assertEqual(data, {}) def test_simple_pi_with_double_quotes(self): """ Tests the parsing of the data of a simple processing instruction using double quotes for embedding the value. """ pi_data = """ \t att="value"\n """ data = parse_pi_data(pi_data) self.assertEqual(data, {"att": "value"}) def test_simple_pi_with_simple_quotes(self): """ Tests the parsing of the data of a simple processing instruction using simple quotes for embedding the value. """ pi_data = """ \t att='value'\n """ data = parse_pi_data(pi_data) self.assertEqual(data, {"att": "value"}) def test_complex_pi_with_different_quotes(self): """ Tests the parsing of the data of a complex processing instruction using simple quotes or double quotes for embedding the values. """ pi_data = """ \t att='value'\n att2="value2" att3='value3'""" data = parse_pi_data(pi_data) self.assertEqual(data, {"att": "value", "att2": "value2", "att3": "value3"}) def test_pi_with_non_attribute_data(self): """ Tests the parsing of the data of a complex processing instruction containing non-attribute data. """ pi_data = """ \t keyword att1="value1" """ data = parse_pi_data(pi_data) self.assertEqual(data, {"keyword": None, "att1": "value1"}) # definitions for automatic unit testing if __name__ == "__main__": unittest_main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/test/utils.py0000666000000000000000000000667014762603732015751 0ustar00rootroot# copyright 2003-2011 LOGILAB S.A. (Paris, FRANCE), all rights reserved. # contact http://www.logilab.fr/ -- mailto:contact@logilab.fr # # This file is part of logilab-common. # # logilab-common is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 2.1 of the License, or (at your option) any # later version. # # logilab-common is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License along # with logilab-common. If not, see . 
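# Editor's note: a hedged sketch showing how the WriterTC helper below drives the ureports
# writers; it builds the same 'section_base' layout and renders it with the text writer
# whose expected output appears in test_ureports_text.py above.
from io import StringIO
from logilab.common.ureports.nodes import Section
from logilab.common.ureports.text_writer import TextWriter

layout = Section("Section title", "Section's description.\nBlabla bla")
out = StringIO()
TextWriter().format(layout, out)
print(out.getvalue())   # "Section title" underlined with '=', then the description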
"""unit tests utilities for ureports""" import sys from io import StringIO from logilab.common.ureports.nodes import ( Section, Text, Table, VerbatimText, List, Paragraph, Link, ) buffers = [StringIO] if sys.version_info < (3, 0): from cStringIO import StringIO as cStringIO from StringIO import StringIO as pStringIO buffers += [cStringIO, pStringIO] class WriterTC: def _test_output(self, test_id, layout, msg=None): for buffercls in buffers: buffer = buffercls() self.writer.format(layout, buffer) got = buffer.getvalue() expected = getattr(self, test_id) try: self.assertMultiLineEqual(got, expected) except Exception: print(f"**** using a {buffer.__class__}") print(f"**** got for {test_id}") print(got) print("**** while expected") print(expected) print("****") raise def test_section(self): layout = Section("Section title", "Section's description.\nBlabla bla") self._test_output("section_base", layout) layout.append(Section("Subsection", "Sub section description")) self._test_output("section_nested", layout) def test_verbatim(self): layout = VerbatimText("blablabla") self._test_output("verbatim_base", layout) def test_list(self): layout = List(children=("item1", "item2", "item3", "item4")) self._test_output("list_base", layout) def test_nested_list(self): layout = List( children=(Paragraph(("blabla", List(children=("1", "2", "3")))), "an other point") ) self._test_output("nested_list", layout) def test_table(self): layout = Table(cols=2, children=("head1", "head2", "cell1", "cell2")) self._test_output("table_base", layout) def test_field_table(self): table = Table(cols=2, klass="field", id="mytable") for field, value in (("f1", "v1"), ("f22", "v22"), ("f333", "v333")): table.append(Text(field)) table.append(Text(value)) self._test_output("field_table", table) def test_advanced_table(self): table = Table(cols=2, klass="whatever", id="mytable", rheaders=1) for field, value in (("field", "value"), ("f1", "v1"), ("f22", "v22"), ("f333", "v333")): table.append(Text(field)) table.append(Text(value)) table.append(Link("http://www.perdu.com", "toi perdu ?")) table.append(Text("")) self._test_output("advanced_table", table) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1741359066.0 logilab_common-2.1.0/tox.ini0000666000000000000000000000460414762603732014566 0ustar00rootroot[tox] envlist=py3,check-manifest,mypy,flake8,black,black-run,yamllint [testenv] deps = pytz pytest git+https://github.com/psycojoker/pytest-capture-deprecatedwarnings commands= {envpython} -m pytest test {posargs} [testenv:docs] basepython = python2 deps = -r docs/requirements-doc.txt commands= {envpython} -m sphinx -b html {toxinidir}/docs {toxinidir}/docs/_build/html {posargs} [testenv:check-manifest] skip_install = true deps = check-manifest commands = {envpython} -m check_manifest {toxinidir} [testenv:mypy] deps = mypy >= 0.761 commands = mypy --ignore-missing-imports logilab [testenv:black] basepython = python3 skip_install = true deps = black >= 21.12b0 commands = black --check . [testenv:black-run] basepython = python3 skip_install = true deps = black >= 21.12b0 commands = black . 
[testenv:flake8] skip_install = true deps = flake8 >= 3.6 commands = flake8 --show-source [flake8] basepython = python3 format = pylint ignore = W503, E203, E731, E231, E704 max-line-length = 100 exclude = docs/*,.tox/*,./test/data/* [testenv:pypi-publish] basepython = python3 skip_install = true allowlist_externals = rm deps = twine passenv = TWINE_USERNAME TWINE_PASSWORD commands = rm -rf build dist .egg .egg-info python3 setup.py sdist bdist_wheel twine check dist/* twine upload --skip-existing dist/* [testenv:deb-publish] passenv = JENKINS_USER JENKINS_TOKEN basepython = python3 skip_install = true allowlist_externals = rm sh hg python3 deps = httpie commands = hg clean --all --dirs --files rm -rf build dist .egg .egg-info python3 setup.py sdist sh -c "PACKAGE_NAME=$(python3 setup.py --name) && VERSION=$(python3 setup.py --version) && \ cd dist && \ tar xf $PACKAGE_NAME-$VERSION.tar.gz && \ cd $PACKAGE_NAME-$VERSION && \ cp -a {toxinidir}/debian . && \ mk-origtargz --rename ../$PACKAGE_NAME-$VERSION.tar.gz && \ dpkg-buildpackage -us -uc --no-check-builddeps --build=source " sh -c "cd dist && dcmd zip latest.zip *.changes" http -f POST https://{env:JENKINS_USER}:{env:JENKINS_TOKEN}@jenkins.intra.logilab.fr/job/pkg-from-dsc/buildWithParameters DIST=buster source.zip@dist/latest.zip REPO=buster PUBLISH=true [testenv:release-new] basepython = python3 skip_install = true passenv = EDITOR deps = release-new commands = release-new {posargs:-r auto} [testenv:yamllint] skip_install = true deps = yamllint commands = yamllint .