# Tags: JuliaAI/MLJ.jl
## MLJ v0.22.0

[Diff since v0.21.0](v0.21.0...v0.22.0)

- (**mildly breaking**) The behaviour of `levels` and `unique` on `CategoricalArray`s has changed. (Such arrays are created in MLJ, for example, by `coerce(array, Multiclass)` or `coerce(array, OrderedFactor)`.) The `levels` and `unique` methods now return a `CategoricalVector`, whereas previously they returned a vector of "raw" values. So, what `levels(array)` returned previously is now obtained with `CategoricalArrays.unwrap.(levels(array))`; see the example at the end of this entry. The new behaviour is the result of breaking changes in CategoricalArrays.jl, on which MLJ.jl depends (#1172).

**Merged pull requests:**

- Bump actions/checkout from 4 to 5 (#1175) (@dependabot[bot])
- Regenerate documentation. No new release. (#1182) (@ablaom)
- Regenerate dev docs (#1183) (@ablaom)
- Restore some integration tests for BetaML. (#1184) (@ablaom)
- Bump CategoricalArrays, etc and major testing change 😱 (#1185) (@ablaom)
- For a 0.22 release (#1186) (@ablaom)

**Closed issues:**

- Reinstate CatBoost integraton test (#1092)
- Re-instate integration tests for scikit-learn models (#1119)
- Reinistate integration tests for SymbolicRegression? (#1152)
- Reinstate Outlier detection model s (#1153)
- Error when loading `ContinuousEncoder` (#1181)
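A minimal sketch of the `levels` change described above; the example vector is illustrative, and CategoricalArrays.jl is assumed available (it is a dependency of MLJ):

```julia
using MLJ
import CategoricalArrays

# a CategoricalVector, as produced by coercion in MLJ:
v = coerce(["low", "high", "medium", "low"], OrderedFactor)

levels(v)                             # now returns a CategoricalVector of the levels
CategoricalArrays.unwrap.(levels(v))  # plain Vector of "raw" values, as `levels(v)` returned previously
```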
## MLJ v0.21.0

[Diff since v0.20.9](v0.20.9...v0.21.0)

- (**new models**) Add the following models from MLJTransforms.jl and make them immediately available to the MLJ user (no `@load` call is required): `OrdinalEncoder`, `FrequencyEncoder`, `TargetEncoder`, `ContrastEncoder`, `CardinalityReducer`, `MissingnessEncoder`.
- (**mildly breaking**) Have MLJTransforms.jl, instead of MLJModels.jl, provide the following built-in models, whose behaviour is unchanged: `ContinuousEncoder`, `FillImputer`, `InteractionTransformer`, `OneHotEncoder`, `Standardizer`, `UnivariateBoxCoxTransformer`, `UnivariateDiscretizer`, `UnivariateFillImputer`, `UnivariateTimeTypeToContinuous`.

**Guide for possible source of breakage:** While it was never necessary to use `@load` to load one of the models in the last list (assuming you have first run `using MLJ`), this is frequently not realised by users, and one sees things like `@load OneHotEncoder pkg=MLJModels`, which this release will break. If such a call is preceded by `using MLJ` or `using MLJTransforms`, you can remove the loading command altogether (`OneHotEncoder()` already works); in any case, you can instead use `@load OneHotEncoder pkg=MLJTransforms`. See the example at the end of this entry.

**Merged pull requests:**

- Make updates to reflect code reorganisation around addition of MLJTransforms.jl (#1177) (@ablaom)
- For a 0.21 release (#1180) (@ablaom)

**Closed issues:**

- Decision trees from ScikitLearn.jl not available (#545)
- Document RecursiveFeatureElimination (#1162)
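A sketch of the migration described in the breakage guide above, using `OneHotEncoder` as the example model named in the guide:

```julia
using MLJ

# Pre-0.21 code like this now breaks, because OneHotEncoder is no longer
# provided by MLJModels:
#
#     @load OneHotEncoder pkg=MLJModels
#
# Fix 1: drop the @load call entirely; the model is already in scope after
# `using MLJ` (or `using MLJTransforms`):
hot = OneHotEncoder()

# Fix 2: if you prefer an explicit @load, point it at the new providing package:
#
#     @load OneHotEncoder pkg=MLJTransforms
```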
## MLJ v0.20.8

[Diff since v0.20.7](v0.20.7...v0.20.8)

**Merged pull requests:**

- Add a draft governance document (#1116) (@ablaom)
- Update target_transformations.md (#1137) (@era127)
- Update doc page for third party logging platforms (#1139) (@ablaom)
- Update the manual (#1140) (@ablaom)
- Links on organizations (#1142) (@pebeto)
- Initial commit for adding affinity propagation (#1147) (@Yuan-Ru-Lin)
- Update docs. No new release. (#1148) (@ablaom)
- Remove "DRAFT" tag from GOVERNANCE.md document (#1149) (@ablaom)
- Governance: add Advisory Committee members (#1150) (@ablaom)
- Some minor fixes (#1151) (@ablaom)
- Update mlj_cheatsheet.md (#1156) (@besp-kt)
- Integration of new landing page into docs (#1159) (@ablaom)
- Bump combat for StatisticalMeasures (#1160) (@ablaom)
- Generate updated documentation. No new release. (#1161) (@ablaom)
- Create dependabot.yml and update action versions (#1163) (@abhro)
- Create docs site favicon (#1164) (@abhro)
- Rm symbolic regression from integration tests (#1165) (@ablaom)
- For a 0.20.8 release (#1167) (@ablaom)

**Closed issues:**

- Reexport `CompactPerformanceEvaluation` and `InSample` (#1111)
- [tracking] Add default logger to MLJ (#1124)
- Add Missingness Encoder Transformer (#1133)
- Failed to use TunedModel with precomputed-SVM (#1141)
- Error with `RecursiveFeatureElimination` + `EvoTreeClassifier` (#1145)
- Dump mention of version number in cheatsheet (#1154)
- Remove PartialLeastSquaresRegressor from the docs (#1157)
- Regarding issue in recognition of UUID 5ae90465-5518-4432-b9d2-8a1def2f0cab in a registry (#1158)
## MLJ v0.20.7

[Diff since v0.20.6](v0.20.6...v0.20.7)

**Merged pull requests:**

- Make subpages collapse in manual sidebar. No new release (docs only) (#1131) (@ablaom)
- Regenerate documentation. (#1132) (@ablaom)
- Update FeatureSelection compat (#1136) (@ablaom)
- For a 0.20.7 release (#1138) (@ablaom)

**Closed issues:**

- Recursive Feature Elimination RFE - Feature Request? (#426)
- For 0.17 release (#864)
- Transformers that need to see target (eg, recursive feature elimination) (#874)
- MLJ API for Missing Imputation ? (#950)
- [Tracking issue] Add `raw_training_scores` accessor function (#960)
- Extract probability for a tresholded model (#981)
- Load data that support the Tables.jl interface (#988)
- Add new sk-learn models to the docs (#1066)
- Improve documentation by additional hierarchy (#1094)
- Link in examples on CV Recursive Feature Elimination into the manual or in the planned tutorial interface. (#1129)
- broken link for UnivariateFinite doc string (#1130)
- Add pipeline support for `Unsupervised` models that have a target in `fit` (#1134)
- InteractionTransformer is missing from the "Transformers and Other..." manual page (#1135)
## MLJ v0.20.6

[Diff since v0.20.5](v0.20.5...v0.20.6)

- (**new functionality**) Add `RecursiveFeatureElimination` model wrapper (see the sketch at the end of this entry).

**Merged pull requests:**

- CompatHelper: bump compat for MLJFlow to 0.5 (#1122) (@github-actions[bot])
- Add model wrappers to the Model Browser (#1127) (@ablaom)
- For a 0.20.6 release (#1128) (@ablaom)

**Closed issues:**

- Requesting better exposure to MLJFlux in the model browser (#1110)
- Remove `info(rms)` from the cheatsheet (#1117)
- Enable entry of model wrappers into the MLJ Model Registry (#1125)
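A minimal sketch of the new wrapper. It assumes DecisionTree.jl is installed for the atomic model (which must support feature importances), and the `n_features` keyword is taken from the FeatureSelection.jl documentation; treat the exact keyword names as assumptions.

```julia
using MLJ

# Atomic model that reports feature importances (assumes DecisionTree.jl is installed):
RandomForestRegressor = @load RandomForestRegressor pkg=DecisionTree

X, y = make_regression(100, 6)  # synthetic regression data with 6 features

# Wrap the model to recursively eliminate features until 3 remain:
selector = RecursiveFeatureElimination(RandomForestRegressor(); n_features=3)
mach = machine(selector, X, y)
fit!(mach)

report(mach)  # inspect which features were retained
```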
## MLJ v0.20.4

[Diff since v0.20.3](v0.20.3...v0.20.4)

- Bump the requirement for MLJFlow to 0.4.2. This is technically breaking (but not marked as such because MLJFlow integration is considered experimental). With the latest version of MLFlowClient installed, where previously you would define `logger=MLJFlow.Logger("http://127.0.0.1:5000/")`, you must now do `logger=MLJFlow.Logger("http://127.0.0.1:5000/api")` or similar; see also https://github.com/JuliaAI/MLFlowClient.jl/releases/tag/v0.5.1. An illustration is given at the end of this entry.

**Merged pull requests:**

- Add PartionedLS.jl model to docs and browser (#1103) (@ablaom)
- Update documentation. No new release. (#1104) (@ablaom)
- Update ROADMAP.md (#1106) (@ablaom)
- Use repl language tag for sample (#1107) (@abhro)
- Update cheatsheet and workflow docs (#1109) (@ablaom)
- Force documentation updates. No new release. (#1112) (@ablaom)
- Updates now that MLJ.jl has been moved to the JuliaAI GitHub organization (#1113) (@DilumAluthge)
- Remove Telco example (#1114) (@ablaom)
- Suppress model-generated warnings in integration tests (#1115) (@ablaom)
- Upgrading MLJFlow.jl to v0.4.2 (#1118) (@pebeto)
- For a 0.20.4 release (#1120) (@ablaom)

**Closed issues:**

- Curated list of models (#716)
- Migrate MLJ from alan-turing-institute to JuliaAI? (#829)
- Update the binder demo for MLJ (#851)
- Add wrappers for clustering to get uniform interface (#982)
- Confusing Julia code in adding_models_for_general_use.md (#1061)
- feature_importances for Pipeline including XGBoost don't work (#1100)
- Current performance evaluation objects, recently added to TunedModel histories, are too big (#1105)
- Update cheat sheet instance of depracated `@from_network` code (#1108)
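A sketch of passing a logger to an evaluation after this change. It assumes an MLflow tracking server is running locally on port 5000 (the URL is illustrative) and uses `ConstantClassifier`, which ships with MLJ:

```julia
using MLJ
using MLJFlow

# Note the "/api" suffix now required with recent MLFlowClient versions:
logger = MLJFlow.Logger("http://127.0.0.1:5000/api")

X, y = make_moons(100)        # synthetic binary classification data
model = ConstantClassifier()  # simple built-in probabilistic classifier

# Evaluation results are logged to the MLflow tracking server:
evaluate(model, X, y; resampling=CV(nfolds=3), measure=log_loss, logger=logger)
```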
## MLJ v0.20.3

[Diff since v0.20.2](v0.20.2...v0.20.3)

- Bump compat for MLJFlow to 0.4, to buy into the `MLJBase.save` method-ambiguity fix (in MLJFlow 0.4.1).

**Merged pull requests:**

- Clarify `input_scitype` for Static models (#1076) (@ablaom)
- Documentation updates (#1077) (@ablaom)
- Add integration tests (#1079) (@ablaom)
- Test new integration tests. No new release. (#1080) (@ablaom)
- Fix the integration tests (#1081) (@DilumAluthge)
- Move EvoLinear into [extras] where it belongs (#1083) (@ablaom)
- CI: split the integration tests into a separate job (#1086) (@DilumAluthge)
- CI tweaks (#1087) (@ablaom)
- Update list_of_supported_models for betaml (#1089) (@sylvaticus)
- Update ModelDescriptors.toml for BetaML models (#1090) (@sylvaticus)
- Update documentation to reflect recent BetaML reorganisation (#1091) (@ablaom)
- Replace relevant sections of manual with links to the new MLJModelInterface docs. (#1095) (@ablaom)
- Update docs. No new release (#1096) (@ablaom)
- Update getting_started.md to avoid error from line 338 (#1098) (@caesquerre)
- For a 0.20.3 release (#1102) (@ablaom)

**Closed issues:**

- Meta issue: lssues for possible collaboration with UCL (#673)
- Integration test failures: Classifiers (#939)
- Oversample undersample (#983)
- Add AutoEncoderMLJ model (part of BetaML) (#1074)
- Add new model descriptors to fix doc-generation fail (#1084)
- Update list of BetaML models (#1088)
- Upate ROADMAP.md (#1093)
- Deserialisation fails for wrappers like `TunedModel` when atomic model overloads `save/restore` (#1099)
## MLJ v0.20.2

[Diff since v0.20.1](v0.20.1...v0.20.2)

- Replace `MLFlowLogger` with `MLJFlow.Logger`; see [here](https://github.com/JuliaAI/MLJFlow.jl/releases/tag/v0.3.0). A logger instance is now instantiated with `using MLJFlow; logger = MLJFlow.Logger(baseuri)`; see the snippet at the end of this entry. This is technically breaking but not tagged as such, because MLFlow integration is still [experimental](https://alan-turing-institute.github.io/MLJ.jl/dev/logging_workflows/#Logging-Workflows).

**Merged pull requests:**

- Fix MLJTuning.jl links (#1068) (@jd-foster)
- CompatHelper: add new compat entry for Statistics at version 1, (keep existing compat) (#1070) (@github-actions[bot])
- Bump compat: MLJFlow 0.3 (#1072) (@ablaom)
- For a 0.20.2 release (#1073) (@ablaom)

**Closed issues:**

- Export the name `MLJFlow` (#1067)
- `evaluate` errors (#1069)
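A sketch of the renaming, with the tracking-server URL purely illustrative:

```julia
using MLJ
using MLJFlow

# Before MLJ 0.20.2:
#
#     logger = MLFlowLogger("http://127.0.0.1:5000")
#
# From MLJ 0.20.2:
logger = MLJFlow.Logger("http://127.0.0.1:5000")

# The logger can then be passed to workflows that support logging,
# for example via the `logger` keyword of `evaluate`.
```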