added InstitutionsURLs page and required backend routes and models
1 unresolved thread
Merge request reports
Activity
assigned to @mohammad.torkashvand
added 1 commit
- 80b13499 - added InstitutionsURLs page and required backend routes and models
added 1 commit
- ffbcdf29 - added InstitutionsURLs page and required backend routes and models
added 1 commit
- 0d992359 - added InstitutionsURLs page and required backend routes and models
added 1 commit
- c396e625 - added InstitutionsURLs page and required backend routes and models
added 1 commit
- 05668b9e - added InstitutionsURLs page and required backend routes and models
added 1 commit
- 35e5ac5a - added InstitutionsURLs page and required backend routes and models
added 1 commit
- c4eec485 - added InstitutionsURLs page and required backend routes and models
added 1 commit
- 8cfd13ee - added InstitutionsURLs page and required backend routes and models
added 1 commit
- 7ec79145 - added InstitutionsURLs page and required backend routes and models
        db.session.commit()


    def transfer_institutions_urls(nren_dict):
        rows = query_institutions_urls()
        for row in rows:
            answer_id, nren_name, year, actual_answer = row
            if nren_name not in nren_dict:
                logger.info(f'{nren_name} unknown. Skipping.')
                continue

            urls = extract_urls(text=actual_answer)

We discussed that the extract_urls function is a bit iffy because it's based on a regex. If we want to use that function, we should at least log all the cases where it gives a different result than the entered text. So I propose you also do a regular JSON parse (add the [] if they're missing on old data so that you always have an array), and log all the cases where that gives a different result. Then it's easy to evaluate what the extract_urls function has done.
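The proposed cross-check could look roughly like the sketch below. It is only an illustration of the reviewer's suggestion, not code from the MR: `parse_urls_as_json` and `extract_urls_checked` are hypothetical helper names, and the regex-based `extract_urls` is assumed to exist elsewhere in the project, so it is passed in as a parameter here.

```python
import json
import logging

logger = logging.getLogger(__name__)


def parse_urls_as_json(raw: str) -> list:
    # Hypothetical helper: parse the stored answer as a JSON array.
    # Older rows may lack the enclosing brackets, so add [] when
    # missing so the result is always an array.
    text = raw.strip()
    if not text.startswith('['):
        text = f'[{text}]'
    return json.loads(text)


def extract_urls_checked(actual_answer: str, extract_urls) -> list:
    # Run the project's regex-based extract_urls (passed in, since its
    # implementation is not shown in this thread) and cross-check it
    # against a plain JSON parse, logging every disagreement so the
    # regex's behaviour can be evaluated later.
    urls = extract_urls(text=actual_answer)
    try:
        json_urls = parse_urls_as_json(actual_answer)
    except (json.JSONDecodeError, ValueError):
        logger.warning(f'answer is not valid JSON, keeping regex result: {actual_answer!r}')
        return urls
    if sorted(urls) != sorted(map(str, json_urls)):
        logger.warning(f'extract_urls gave {urls!r} but JSON parse gave {json_urls!r}')
    return urls
```

With this in place, grepping the logs for the two warning messages gives a quick inventory of every row where the regex and the JSON parse disagree.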
added 1 commit
- cf6a7a50 - added InstitutionsURLs page and required backend routes and models
added 1 commit
- b2757bca - added InstitutionsURLs page and required backend routes and models
mentioned in commit 7c67a1c7