| repo_name (string, 1-62 chars) | dataset (1 class) | lang (string, 11 classes) | pr_id (int64, 1-20.1k) | owner (string, 2-34 chars) | reviewer (string, 2-39 chars) | diff_hunk (string, 15-262k chars) | code_review_comment (string, 1-99.6k chars) |
|---|---|---|---|---|---|---|---|
langfuse-python | github_2023 | python | 384 | langfuse | maxdeichmann | @@ -0,0 +1,366 @@
+from collections import defaultdict
+from typing import Any, Dict, List, Optional, Union, Tuple, Callable
+from uuid import uuid4
+import logging
+
+from langfuse.client import (
+ StatefulSpanClient,
+ StatefulTraceClient,
+ StatefulGenerationClient,
+)
+from langfuse.decorators.error_loggi... | switch to `total` |
langfuse-python | github_2023 | python | 384 | langfuse | maxdeichmann | @@ -0,0 +1,366 @@
+from collections import defaultdict
+from typing import Any, Dict, List, Optional, Union, Tuple, Callable
+from uuid import uuid4
+import logging
+
+from langfuse.client import (
+ StatefulSpanClient,
+ StatefulTraceClient,
+ StatefulGenerationClient,
+)
+from langfuse.decorators.error_loggi... | Remove the input, as this will be too large for our API |
langfuse-python | github_2023 | python | 384 | langfuse | maxdeichmann | @@ -0,0 +1,366 @@
+from collections import defaultdict
+from typing import Any, Dict, List, Optional, Union, Tuple, Callable
+from uuid import uuid4
+import logging
+
+from langfuse.client import (
+ StatefulSpanClient,
+ StatefulTraceClient,
+ StatefulGenerationClient,
+)
+from langfuse.decorators.error_loggi... | there is no user_id on span |
langfuse-python | github_2023 | others | 384 | langfuse | maxdeichmann | @@ -7,14 +7,15 @@ license = "MIT"
readme = "README.md"
[tool.poetry.dependencies]
-python = ">=3.8.1,<4.0"
+python = ">=3.8.1,<3.12"
httpx = ">=0.15.4,<0.26.0"
pydantic = ">=1.10.7, <3.0"
backoff = "^2.2.1"
openai = ">=0.27.8"
wrapt = "1.14"
langchain = { version = ">=0.0.309", optional = true }
chevron = "^... | Do you know by any chance until when this implementation is stable? It would be great to also support versions which are a couple of months old. |
langfuse-python | github_2023 | python | 384 | langfuse | maxdeichmann | @@ -0,0 +1,369 @@
+from collections import defaultdict
+from typing import Any, Dict, List, Optional, Union, Tuple, Callable
+from uuid import uuid4
+import logging
+
+from langfuse.client import (
+ StatefulSpanClient,
+ StatefulTraceClient,
+ StatefulGenerationClient,
+)
+from langfuse.decorators.error_loggi... | I think it would be great to have variable declarations for these at the top, at class level, to help our IDEs. |
langfuse-python | github_2023 | python | 384 | langfuse | maxdeichmann | @@ -0,0 +1,369 @@
+from collections import defaultdict
+from typing import Any, Dict, List, Optional, Union, Tuple, Callable
+from uuid import uuid4
+import logging
+
+from langfuse.client import (
+ StatefulSpanClient,
+ StatefulTraceClient,
+ StatefulGenerationClient,
+)
+from langfuse.decorators.error_loggi... | Could you add a quick note that the trace_id is not a uuid or similar but rather a task name? |
langfuse-python | github_2023 | python | 384 | langfuse | maxdeichmann | @@ -0,0 +1,369 @@
+from collections import defaultdict
+from typing import Any, Dict, List, Optional, Union, Tuple, Callable
+from uuid import uuid4
+import logging
+
+from langfuse.client import (
+ StatefulSpanClient,
+ StatefulTraceClient,
+ StatefulGenerationClient,
+)
+from langfuse.decorators.error_loggi... | I think we would want to keep the additional_kwargs in case there is something in there. |
langfuse-python | github_2023 | python | 384 | langfuse | maxdeichmann | @@ -0,0 +1,369 @@
+from collections import defaultdict
+from typing import Any, Dict, List, Optional, Union, Tuple, Callable
+from uuid import uuid4
+import logging
+
+from langfuse.client import (
+ StatefulSpanClient,
+ StatefulTraceClient,
+ StatefulGenerationClient,
+)
+from langfuse.decorators.error_loggi... | can you add a short comment on why we do this at the end and the performance implications for index + query? |
langfuse-python | github_2023 | python | 367 | langfuse | maxdeichmann | @@ -0,0 +1,94 @@
+from contextlib import asynccontextmanager
+from fastapi import FastAPI, Query, BackgroundTasks
+import os
+import sys
+from langfuse import Langfuse
+from langfuse.openai import openai
+import uvicorn
+
+
+@asynccontextmanager
+async def lifespan(app: FastAPI):
+ # Operation on startup
+
+ yiel... | I think we can remove the function parameters here, right? |
langfuse-python | github_2023 | python | 367 | langfuse | maxdeichmann | @@ -0,0 +1,94 @@
+from contextlib import asynccontextmanager
+from fastapi import FastAPI, Query, BackgroundTasks
+import os
+import sys
+from langfuse import Langfuse
+from langfuse.openai import openai
+import uvicorn
+ | We can also remove all the unused imports here |
langfuse-python | github_2023 | others | 367 | langfuse | maxdeichmann | @@ -0,0 +1,24 @@
+# fastapi_example
+
+This is an example FastAPI application showcasing integration with Langfuse for event tracing and response generation.
+
+1. **Shutdown Behavior**: The application defines shutdown logic using FastAPI's lifespan feature. On shutdown, it flushes all events to Langfuse, ensuring dat... | I think we can remove openai here, as this is not used, right? |
langfuse-python | github_2023 | others | 367 | langfuse | maxdeichmann | @@ -0,0 +1,19 @@
+[tool.poetry]
+name = "fastapi-example"
+version = "0.1.0"
+description = ""
+authors = ["ChrisTho23 <christophe.thomassin23@gmail.com>"]
+readme = "README.md"
+
+[tool.poetry.dependencies]
+python = "^3.10"
+fastapi = "^0.109.2"
+uvicorn = "^0.27.1"
+langfuse = "2.13.3a0" | can you use latest langfuse here always? |
langfuse-python | github_2023 | python | 367 | langfuse | maxdeichmann | @@ -0,0 +1,11 @@
+from django.shortcuts import render
+from django.http import JsonResponse, HttpResponseServerError
+from myapp.langfuse_integration import get_response_openai, langfuse | I think there are some unused imports here |
langfuse-python | github_2023 | others | 367 | langfuse | maxdeichmann | @@ -0,0 +1,16 @@
+[tool.poetry]
+name = "django-example"
+version = "0.1.0"
+description = ""
+authors = ["ChrisTho23 <christophe.thomassin23@gmail.com>"]
+readme = "README.md"
+
+[tool.poetry.dependencies]
+python = "^3.10"
+django = "^5.0.2"
+langfuse = "^2.13.3" | Can we also use latest here always? |
langfuse-python | github_2023 | python | 367 | langfuse | maxdeichmann | @@ -0,0 +1,16 @@
+import os
+import signal
+import sys
+from .langfuse_integration import langfuse_flush
+
+def shutdwon_handler(*args):
+ """
+ This function handles the shutdown process.
+
+ It calls the langfuse_flush function to flush any pending changes,
+ and then exits the program with a s... | I think we can also listen to sigterm here as this is used to gracefully terminate a process. https://www.baeldung.com/linux/sigint-and-other-termination-signals |
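The reviewer's suggestion in the row above — registering a handler for SIGTERM in addition to SIGINT so the flush also runs on a graceful termination request — could be sketched as below. The `flush_events` function is a hypothetical stand-in; the real handler would call the langfuse flush function:

```python
import signal
import sys

def flush_events():
    # Hypothetical stand-in: the real handler would flush the tracing
    # client's pending event queue before the process exits.
    print("flushing pending events")

def shutdown_handler(signum, frame):
    """Flush pending events, then exit with a success status code."""
    flush_events()
    sys.exit(0)

# Register the same handler for SIGINT (Ctrl+C) and SIGTERM, the signal
# process managers send to request a graceful shutdown.
for sig in (signal.SIGINT, signal.SIGTERM):
    signal.signal(sig, shutdown_handler)
```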
langfuse-python | github_2023 | others | 343 | langfuse | maxdeichmann | @@ -79,3 +79,11 @@ poetry run pre-commit install
- Create PyPi API token: https://pypi.org/manage/account/token/
- Setup: `poetry config pypi-token.pypi your-api-token`
9. Create a release on GitHub with the changelog
+
+### SDK Reference
+
+The SDK reference is generated via pdoc. To update the reference, run... | can you also add a comment on how to install the docs group before running this command? |
langfuse-python | github_2023 | python | 332 | langfuse | maxdeichmann | @@ -23,7 +23,7 @@ def __init__(
base_url: str,
version: str,
timeout: int,
- session: httpx.Client,
+ session: httpx.Client = httpx.Client(), | why is this change required? |
langfuse-python | github_2023 | python | 326 | langfuse | maxdeichmann | @@ -59,3 +59,13 @@ def __init__(self, prompt: Prompt):
def compile(self, **kwargs) -> str:
return chevron.render(self.prompt, kwargs)
+
+ def __eq__(self, other): | This here is only used in testing code at the moment, correct? |
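The `__eq__` under discussion in the row above is typically paired with `__hash__` so instances stay usable as dict keys and set members. A minimal sketch, with an assumed class name and field names rather than the real langfuse Prompt attributes:

```python
class PromptTemplate:
    """Minimal sketch; the class name and fields are assumptions, not the
    real langfuse Prompt API."""

    def __init__(self, name, version, prompt):
        self.name = name
        self.version = version
        self.prompt = prompt

    def __eq__(self, other):
        if not isinstance(other, PromptTemplate):
            return NotImplemented
        return (self.name, self.version, self.prompt) == (
            other.name, other.version, other.prompt)

    def __hash__(self):
        # Keep hash consistent with __eq__ so instances behave in sets/dicts.
        return hash((self.name, self.version, self.prompt))
```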
langfuse-python | github_2023 | python | 102 | langfuse | maxdeichmann | @@ -524,7 +538,9 @@ def on_llm_end(
)
self.runs[run_id].state = self.runs[run_id].state.update(
- UpdateGeneration(completion=extracted_response, end_time=datetime.now(), usage=llm_usage)
+ UpdateGeneration(
+ completion=ex... | Why do you want to put the token metadata on the metadata as well? We are already tracking that in `usage`. |
langfuse-python | github_2023 | python | 150 | langfuse | maxdeichmann | @@ -652,6 +652,36 @@ def initialize_huggingface_llm(prompt: PromptTemplate) -> LLMChain:
assert observation["output"] is not None
assert observation["output"] != ""
+def test_callback_huggingface_pipeline(): | ```
tests/test_langchain.py::test_callback_huggingface_pipeline FAILED
============================================================================================= FAILURES ==============================================================================================
______________________________________________... |
langfuse-python | github_2023 | python | 215 | langfuse | marcklingen | @@ -274,7 +294,8 @@ def on_agent_action(
if run_id not in self.runs:
raise Exception("run not found")
- self.runs[run_id] = self.runs[run_id].update(UpdateSpan(endTime=datetime.now(), output=action, version=self.version))
+ self.runs[run_id] = self.runs[run_id].upda... | why aren't agent actions spans? |
langfuse-python | github_2023 | python | 215 | langfuse | marcklingen | @@ -415,82 +597,228 @@ def _add_default_values(self, body: dict):
body["start_time"] = datetime.now()
return body
- def generation(self, body: CreateGeneration):
+ def generation(
+ self,
+ *,
+ id: typing.Optional[str] = None,
+ name: typing.Optional[str] = Non... | event() does not exist in `class Langfuse` |
langfuse-python | github_2023 | python | 215 | langfuse | marcklingen | @@ -143,16 +158,29 @@ def _get_langfuse_data_from_kwargs(resource: OpenAiDefinition, langfuse: Langfus
modelParameters = {
"temperature": kwargs.get("temperature", 1),
- "maxTokens": kwargs.get("max_tokens", float("inf")),
+ "maxTokens": kwargs.get("max_tokens", float("inf")), # casing? | max_tokens? added comment to be able to comment on this |
langfuse-python | github_2023 | python | 134 | langfuse | maxdeichmann | @@ -466,6 +466,8 @@ def __on_llm_action(
)
if kwargs["invocation_params"]["_type"] in ["anthropic-llm", "anthropic-chat"]:
model_name = "anthropic" # unfortunately no model info by anthropic provided.
+ elif kwargs["invocation_params"]["_type"] == "amazon_bedro... | @kobrinartem thanks for the contribution! Did you test whether Langchain somehow provides the model name? |
langfuse-python | github_2023 | python | 144 | langfuse | maxdeichmann | @@ -11,7 +11,10 @@
class TraceWithDetails(Trace):
observations: typing.List[str] = pydantic.Field(description=("List of observation ids\n"))
- scores: typing.List[str] = pydantic.Field(description=("List of score ids\n"))
+ scores: typing.List[str] = pydantic.Field(
+ description=("List of scor... | is this wanted? |
langfuse-python | github_2023 | python | 103 | langfuse | maxdeichmann | @@ -0,0 +1,16 @@
+from dotenv import load_dotenv
+from langfuse.integrations import openai
+
+load_dotenv()
+
+
+def test_openai_chat_completion():
+ completion = openai.ChatCompletion.create(
+ model="gpt-3.5-turbo", messages=[{"role": "user", "content": "1 + 1 = "}], temperature=0
+ )
+ assert len(com... | can you expose the flush function and call it in the tests? Also, did you try running this locally? |
langfuse-python | github_2023 | python | 103 | langfuse | maxdeichmann | @@ -0,0 +1,83 @@
+import os
+import functools
+from datetime import datetime
+from dotenv import load_dotenv
+
+import openai
+from openai.api_resources import ChatCompletion, Completion
+
+from langfuse import Langfuse
+from langfuse.client import InitialGeneration
+from langfuse.api.resources.commons.types.llm_usage ... | Why is this needed here? So far this worked without it. |
langfuse-python | github_2023 | python | 103 | langfuse | maxdeichmann | @@ -0,0 +1,83 @@
+import os
+import functools
+from datetime import datetime
+from dotenv import load_dotenv
+
+import openai
+from openai.api_resources import ChatCompletion, Completion
+
+from langfuse import Langfuse
+from langfuse.client import InitialGeneration
+from langfuse.api.resources.commons.types.llm_usage ... | I would maybe rename the class to reflect a bit better how it works. |
langfuse-python | github_2023 | python | 103 | langfuse | maxdeichmann | @@ -0,0 +1,83 @@
+import os
+import functools
+from datetime import datetime
+from dotenv import load_dotenv
+
+import openai
+from openai.api_resources import ChatCompletion, Completion
+
+from langfuse import Langfuse
+from langfuse.client import InitialGeneration
+from langfuse.api.resources.commons.types.llm_usage ... | maybe `replace_openai_funcs`? |
langfuse-python | github_2023 | python | 103 | langfuse | maxdeichmann | @@ -0,0 +1,40 @@
+import os
+from dotenv import load_dotenv
+from langfuse.integrations import openai
+from langfuse.api.client import FintoLangfuse
+from langfuse.version import __version__ as version
+
+from tests.utils import create_uuid
+
+
+load_dotenv()
+
+api = FintoLangfuse(
+ environment=os.environ["HOST"],... | I just tested this out locally. openai is shown on my end as type any in the IDE, whereas when using the original openai SDK, i get autocompletions. Is there a way to solve that? |
langfuse-python | github_2023 | python | 103 | langfuse | maxdeichmann | @@ -0,0 +1,99 @@
+import os
+import functools
+from datetime import datetime
+from dotenv import load_dotenv
+
+import openai
+from openai.api_resources import ChatCompletion, Completion
+
+from langfuse import Langfuse
+from langfuse.client import InitialGeneration
+from langfuse.api.resources.commons.types.llm_usage ... | Do you think we can instantiate this as a Singleton? I fear our users will import this multiple times and then create multiple Langfuse objects under the hood.
I think one instantiation of the Langfuse object would be ideal. Could we also expose the flush function from there? If yes, we can remove it above in the `_l... |
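The singleton concern raised in the row above — repeated imports or constructions silently creating multiple client objects — can be addressed with an `__new__`-based singleton. This is a sketch under assumed names, not the real langfuse wrapper:

```python
class TracingClientSingleton:
    """Hypothetical sketch: however many times user code constructs this
    class, only one underlying client instance ever exists."""

    _instance = None

    def __new__(cls, *args, **kwargs):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance._initialized = False
        return cls._instance

    def __init__(self, public_key=None, secret_key=None):
        if self._initialized:
            return  # skip re-initialization on repeat construction
        self.public_key = public_key
        self.secret_key = secret_key
        self._initialized = True

    def flush(self):
        # Exposing flush here means callers never need a second handle.
        return "flushed"
```

Because `__init__` returns early once initialized, a second construction neither replaces the stored keys nor spawns a second client under the hood.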
langfuse-python | github_2023 | python | 103 | langfuse | fxmb | @@ -0,0 +1,99 @@
+import os
+import functools
+from datetime import datetime
+from dotenv import load_dotenv
+
+import openai
+from openai.api_resources import ChatCompletion, Completion | @maxdeichmann we should probably not fully import "lesser known" packages directly to provide more context. Can we change this to `from openai import api_resources` and then call like `api_resources.ChatCompletion`? Just a thought |
langfuse-python | github_2023 | python | 103 | langfuse | fxmb | @@ -0,0 +1,99 @@
+import os
+import functools
+from datetime import datetime
+from dotenv import load_dotenv
+
+import openai
+from openai.api_resources import ChatCompletion, Completion
+
+from langfuse import Langfuse
+from langfuse.client import InitialGeneration
+from langfuse.api.resources.commons.types.llm_usage ... | This will throw an error if key is not present afaik. Is this intended? Rather use `getenv()` and provide fallback logic in case key is not present? It's a personal preference so feel free to ignore. |
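The `getenv()` fallback pattern suggested in the row above avoids the `KeyError` that `os.environ[...]` raises when a key is unset. A small sketch; the function and environment-variable names here are illustrative assumptions:

```python
import os

def resolve_credential(explicit=None, env_var="LANGFUSE_PUBLIC_KEY",
                       default=None):
    """Prefer an explicit argument, then the environment variable, then a
    default. os.environ[env_var], by contrast, raises KeyError when unset."""
    if explicit is not None:
        return explicit
    return os.getenv(env_var, default)
```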
langfuse-python | github_2023 | python | 116 | langfuse | github-advanced-security[bot] | @@ -1,5 +1,17 @@
+import os
from uuid import uuid4
+from langfuse.api.client import FintoLangfuse
+
def create_uuid():
return str(uuid4())
+
+
+def get_api():
+ print(os.environ.get("LANGFUSE_PUBLIC_KEY"), os.environ.get("LANGFUSE_SECRET_KEY"), os.environ.get("LANGFUSE_HOST")) | ## Clear-text logging of sensitive information
This expression logs [sensitive data (secret)](1) as clear text.
[Show more details](https://github.com/langfuse/langfuse-python/security/code-scanning/1) |
langfuse-python | github_2023 | python | 100 | langfuse | marcklingen | @@ -36,9 +36,6 @@ class Langfuse(object):
def __init__(
self,
- public_key: str,
- secret_key: str,
- host: Optional[str] = None, | For backwards compatibility, I'd suggest to keep the constructor arguments (public, secret, host) and make them optional. This applies to the SDK and the langchain callback handler.
If set, constructor arguments then take precedence over environment variable. Removing the argument leads to unexpected behavior for exi... |
langfuse-python | github_2023 | python | 100 | langfuse | maxdeichmann | @@ -34,14 +35,17 @@ def __init__(
statefulTraceClient: Optional[StatefulTraceClient] = None,
release: Optional[str] = None,
) -> None:
+ public_key = public_key if public_key else os.environ.get("LF_PK")
+ secret_key = secret_key if secret_key else os.environ.get("LF_SK")
+
... | Should we set the keys here? |
langfuse-python | github_2023 | python | 100 | langfuse | marcklingen | @@ -31,6 +32,9 @@ def __init__(
statefulClient: Optional[StatefulTraceClient | StatefulSpanClient] = None,
release: Optional[str] = None,
) -> None:
+ public_key = public_key if public_key else os.environ.get("LF_PK") | Environment variables will be checked in Langfuse client, no need to do this here again. I'd suggest to just pass public_key and secret_key to Langfuse() |
langfuse-python | github_2023 | python | 100 | langfuse | marcklingen | @@ -71,15 +71,19 @@ def __init__(
self.task_manager = TaskManager()
+ public_key = public_key if public_key else os.environ.get("LF_PK")
+ secret_key = secret_key if secret_key else os.environ.get("LF_SK")
+ host = host if host else os.environ.get("HOST")
+
self.base_url = hos... | This suggests that env var is the only option. I'd suggest `"public_key is required, set as parameter or environment variable "LF_PK""` |
langfuse-python | github_2023 | python | 100 | langfuse | marcklingen | @@ -71,15 +71,19 @@ def __init__(
self.task_manager = TaskManager()
+ public_key = public_key if public_key else os.environ.get("LF_PK")
+ secret_key = secret_key if secret_key else os.environ.get("LF_SK")
+ host = host if host else os.environ.get("HOST")
+
self.base_url = hos... | typo, should be secret_key |
langfuse-python | github_2023 | python | 100 | langfuse | marcklingen | @@ -1,10 +1,13 @@
+import os
import requests
class LangfuseAPI:
- def __init__(self, username=None, password=None, base_url="http://localhost:3000"):
- self.auth = (username, password) if username and password else None
- self.BASE_URL = base_url
+ def __init__(self, username=None, password=No... | Is this necessary? I'd assume that we never create the LangfuseAPI directly but only via Langfuse() |
langfuse-python | github_2023 | python | 100 | langfuse | marcklingen | @@ -67,23 +73,28 @@ def test_flush():
def test_setup_wthout_pk():
# set up the consumer with more requests than a single batch will allow
+ os.environ.pop("LF_PK")
with pytest.raises(ValueError):
- Langfuse(public_key=None, secret_key=os.environ.get("LF_SK"))
+ Langfuse()
+ os.environ["... | Great, also add separate tests for LF_SK and LF_HOST parameter. Goal is to check that overriding env in constructor works |
langfuse-python | github_2023 | others | 4 | langfuse | derino | @@ -0,0 +1,76 @@
+on:
+ pull_request:
+ push:
+ branches:
+ - main
+ - master
+ workflow_dispatch:
+
+concurrency:
+ group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
+ cancel-in-progress: false
+
+jobs:
+ ci:
+ runs-on: ubuntu-latest
+ env: # Or as an environment variable
+... | @maxdeichmann The tests don't seem to be run on the python version that is set up:
https://github.com/langfuse/langfuse-python/actions/runs/6309393965/job/17129223793?pr=107#step:12:16 |
langfuse-python | github_2023 | python | 35 | langfuse | maxdeichmann | @@ -12,6 +12,7 @@
class UpdateSpanRequest(pydantic.BaseModel):
span_id: str = pydantic.Field(alias="spanId")
trace_id: typing.Optional[str] = pydantic.Field(alias="traceId")
+ name: typing.Optional[str] | Good catch! All files in langfuse/api are auto-generated by fern. Added a PR on the core project and will update this PR as well https://github.com/langfuse/langfuse/pull/101
By keeping the openapi docs in core in sync, we can also update the TS SDK accordingly. |
langfuse-python | github_2023 | python | 35 | langfuse | maxdeichmann | @@ -138,8 +139,7 @@ def create_trace():
def create_span():
try:
- new_body = body.copy(update={"id": new_span_id})
- new_body = body.copy(update={"trace_id": new_trace_id})
+ new_body = body.copy(update={"id": new_span_id, "trace_i... | good catch! |
langfuse-python | github_2023 | python | 35 | langfuse | maxdeichmann | @@ -179,8 +179,7 @@ def create_trace():
def create_generation():
try:
- new_body = body.copy(update={"id": new_generation_id})
- new_body = body.copy(update={"trace_id": new_trace_id})
+ new_body = body.copy(update={"id": new_gener... | same here! |
langfuse-python | github_2023 | python | 35 | langfuse | maxdeichmann | @@ -190,8 +189,8 @@ def create_generation():
self.log.exception(e)
raise e
- self.task_manager.add_task(new_generation_id, create_generation)
self.task_manager.add_task(new_trace_id, create_trace)
+ self.task_manager.add_task(new_generation_... | Ordering does not matter, but it is more logical to have it this way |
langfuse-python | github_2023 | python | 35 | langfuse | maxdeichmann | @@ -358,7 +357,7 @@ def update(self, body: UpdateGeneration):
def task():
try:
- new_body = body.copy(update={"generation_id": self.id})
+ new_body = body.copy(update={"generation_id": self.id, "trace_id": self.trace_id}) | Why this change? It does not destroy anything but did it not work without? |
langfuse-python | github_2023 | python | 35 | langfuse | maxdeichmann | @@ -424,3 +426,76 @@ def test_create_trace_with_id_and_generation():
span = trace["observations"][0]
assert span["name"] == "generation"
assert span["traceId"] == trace["id"]
+
+
+def test_end_generation():
+ langfuse = Langfuse(os.environ.get("LF_PK"), os.environ.get("LF_SK"), os.environ.get("HOST"))... | Do you think these types of tests work reliably, as they are based on time? Maybe we should just check that the endtime is not null? |
langfuse-python | github_2023 | python | 35 | langfuse | maxdeichmann | @@ -401,6 +416,14 @@ def task():
except Exception as e:
self.log.exception(e)
+ def end(self):
+ try:
+ end_time = datetime.now()
+ self.update(UpdateGeneration(endTime=end_time)) | I missed this one here: I think we need to return this function here so that end() returns a stateful client (also above). Otherwise good for me. |
laravel-filament-flexible-content-blocks | github_2023 | others | 25 | statikbe | sten | @@ -68,6 +68,7 @@ Here is a brief overview of the choices made:
- `spatie/laravel-sluggable`: for slugs
- `spatie/laravel-translatable`: for translations as this works together with the first party filament translatable package.
- `dereuromark/media-embed`: to support video embeds of [various media services](https:... | There is no real dependency on this library. It is only used in the examples. So I would prefer not to include it. You can use whatever you want.
In the future, I am planning to implement support for additional fields for SEO metadata types, like articles, pages, how-to lists, etc. Maybe other libs are more useful for t... |
laravel-filament-flexible-content-blocks | github_2023 | others | 25 | statikbe | sten | @@ -38,6 +38,7 @@
],
"require": {
"php": "^8.1",
+ "artesaos/seotools": "^1.2", | Please, remove. See comment above. |
laravel-filament-flexible-content-blocks | github_2023 | php | 25 | statikbe | sten | @@ -0,0 +1,17 @@
+<?php
+
+namespace App\View\Components;
+
+use Illuminate\View\Component;
+use Illuminate\View\View;
+
+class baseLayour extends Component | Typo in class name: BaseLayout |
laravel-filament-flexible-content-blocks | github_2023 | php | 6 | statikbe | sten | @@ -134,9 +134,9 @@ public function getChildComponents(): array
TextInput::make(static::FIELD_BUTTON_LABEL)
->label(trans('filament-flexible-content-blocks::filament-flexible-content-blocks.form_component.content_blocks.call_to_action_button_label'))
... | put `->inline(false)` on a new line |
timefold-solver | github_2023 | java | 1,463 | TimefoldAI | zepfred | @@ -3,32 +3,52 @@
import java.util.IdentityHashMap;
import java.util.Map;
+import ai.timefold.solver.core.api.domain.solution.PlanningSolution;
import ai.timefold.solver.core.impl.bavet.uni.AbstractForEachUniNode;
+import ai.timefold.solver.core.impl.bavet.uni.AbstractForEachUniNode.LifecycleOperation;
public a... | Cool! |
timefold-solver | github_2023 | java | 1,463 | TimefoldAI | zepfred | @@ -3,32 +3,52 @@
import java.util.IdentityHashMap;
import java.util.Map;
+import ai.timefold.solver.core.api.domain.solution.PlanningSolution;
import ai.timefold.solver.core.impl.bavet.uni.AbstractForEachUniNode;
+import ai.timefold.solver.core.impl.bavet.uni.AbstractForEachUniNode.LifecycleOperation;
public a... | We are passing an annotation class here. I'm unsure how `nodeNetwork.getForEachNodes()` will locate the solution class. |
timefold-solver | github_2023 | java | 1,463 | TimefoldAI | zepfred | @@ -83,7 +80,7 @@ public UniDataStream<Solution_, A> addNull() {
throw new UnsupportedOperationException();
}
- public AbstractDataset<Solution_, UniTuple<A>> createDataset() {
+ public UniDataset<Solution_, A> createDataset() { | Should we avoid the use of specialized classes here? |
timefold-solver | github_2023 | java | 1,463 | TimefoldAI | zepfred | @@ -10,24 +9,23 @@
import ai.timefold.solver.core.api.score.stream.Joiners;
import ai.timefold.solver.core.impl.domain.solution.descriptor.SolutionDescriptor;
-import ai.timefold.solver.core.impl.move.streams.maybeapi.stream.DataStreamFactory;
-import ai.timefold.solver.core.impl.move.streams.maybeapi.stream.Soluti... | Should this class be abstract, or will there be no default implementation? |
timefold-solver | github_2023 | java | 1,463 | TimefoldAI | zepfred | @@ -12,48 +11,40 @@
import ai.timefold.solver.core.impl.bavet.uni.ForEachExcludingUnassignedUniNode;
import ai.timefold.solver.core.impl.bavet.uni.ForEachFromSolutionUniNode;
import ai.timefold.solver.core.impl.bavet.uni.ForEachIncludingUnassignedUniNode;
-import ai.timefold.solver.core.impl.bavet.uni.ForEachStaticU... | Why not add specialized classes, one for the predicate and another for the FromSolutionValueCollectingFunction? |
timefold-solver | github_2023 | java | 1,463 | TimefoldAI | zepfred | @@ -0,0 +1,136 @@
+package ai.timefold.solver.core.impl.move.streams;
+
+import java.util.Iterator;
+import java.util.NoSuchElementException;
+import java.util.Objects;
+import java.util.Set;
+import java.util.function.BiPredicate;
+
+import ai.timefold.solver.core.impl.bavet.common.tuple.UniTuple;
+import ai.timefold.... | Nice! |
timefold-solver | github_2023 | java | 1,463 | TimefoldAI | zepfred | @@ -81,11 +81,11 @@ public abstract class AbstractScoreDirector<Solution_, Score_ extends Score<Scor
private int workingInitScore = 0;
private String undoMoveText;
- // Null when tracking disabled
- private final boolean trackingWorkingSolution;
private final SolutionTracker<Solution_> solutionTr... | It cannot be null |
timefold-solver | github_2023 | java | 1,463 | TimefoldAI | zepfred | @@ -59,6 +59,7 @@ public BavetConstraintStreamScoreDirector(
@Override
public void setWorkingSolution(Solution_ workingSolution) {
session = scoreDirectorFactory.newSession(workingSolution, constraintMatchPolicy, derived);
+ session.initialize(workingSolution); | Is this necessary at this stage of the implementation? |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,222 @@
+package ai.timefold.solver.core.impl.bavet.common;
+
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.TreeMap;
+import java.util.function.BiConsumer;
+import java.util.functio... | Let's use `String::format` |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,222 @@
+package ai.timefold.solver.core.impl.bavet.common;
+
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.TreeMap;
+import java.util.function.BiConsumer;
+import java.util.functio... | Please apply the previous comment to all similar occurrences |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,222 @@
+package ai.timefold.solver.core.impl.bavet.common;
+
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.TreeMap;
+import java.util.function.BiConsumer;
+import java.util.functio... | Should it be zero instead of Integer.MIN_VALUE? |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -26,22 +26,23 @@ public void insert(A a) {
@Override
public void update(A a) {
- UniTuple<A> tuple = tupleMap.get(a);
+ var tuple = tupleMap.get(a);
if (tuple == null) { // The tuple was never inserted because it did not pass the filter.
insert(a);
} else if (... | Why don't we need to filter before removing? |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -26,22 +26,23 @@ public void insert(A a) {
@Override
public void update(A a) {
- UniTuple<A> tuple = tupleMap.get(a);
+ var tuple = tupleMap.get(a);
if (tuple == null) { // The tuple was never inserted because it did not pass the filter.
insert(a);
} else if (... | The comment is inconsistent |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -3,20 +3,21 @@
import ai.timefold.solver.core.impl.bavet.common.tuple.TupleLifecycle;
import ai.timefold.solver.core.impl.bavet.common.tuple.UniTuple;
-public final class ForEachIncludingUnassignedUniNode<A> extends AbstractForEachUniNode<A> {
+public sealed class ForEachUniNode<A> | Since I don't have deep knowledge of the ConstraintStream ecosystem, I'm missing comments on the node classes that briefly describe their roles. |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -29,23 +30,27 @@ public int layerCount() {
return layeredNodes.length;
}
- @SuppressWarnings("unchecked")
- public AbstractForEachUniNode<Object>[] getApplicableForEachNodes(Class<?> factClass) {
+ public Stream<AbstractForEachUniNode<?>> getForEachNodes() {
+ return declaredClassToNo... | Propagate seems to be a common term used in CS. I'm unsure if the method name change is a good idea. |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -134,4 +166,8 @@ public VariableDescriptorAwareScoreDirector<Solution_> getScoreDirector() {
return scoreDirector;
}
+ public void resetWorkingSolution(Solution_ workingSolution) { | Please add a comment about why we are adding an empty method |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,46 @@
+package ai.timefold.solver.core.impl.move.streams.dataset.common.bridge;
+
+import java.util.Objects;
+
+import ai.timefold.solver.core.impl.bavet.common.TupleSource;
+import ai.timefold.solver.core.impl.move.streams.dataset.AbstractDataStream;
+import ai.timefold.solver.core.impl.move.streams.dataset... | What is the meaning of Aft? |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,46 @@
+package ai.timefold.solver.core.impl.move.streams.dataset.common.bridge;
+
+import java.util.Objects;
+
+import ai.timefold.solver.core.impl.bavet.common.TupleSource;
+import ai.timefold.solver.core.impl.move.streams.dataset.AbstractDataStream;
+import ai.timefold.solver.core.impl.move.streams.dataset... | ```suggestion
var that = (AftBridgeUniDataStream<?, ?>) o;
``` |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,76 @@
+package ai.timefold.solver.core.impl.move.streams.dataset;
+
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Set;
+
+import ai.timefold.solver.core.impl.bavet.common.BavetStream;
+import ai.timefold.solver.core.impl.bavet.common.TupleSource;
+import ai.timefold.solver.core.impl.... | Let's use `String::format` in all similar occurrences |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,76 @@
+package ai.timefold.solver.core.impl.move.streams.dataset;
+
+import java.util.Collections;
+import java.util.IdentityHashMap;
+import java.util.Map;
+import java.util.Objects;
+
+import ai.timefold.solver.core.impl.bavet.AbstractSession;
+import ai.timefold.solver.core.impl.bavet.NodeNetwork;
+import... | Why are we overriding it? |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,64 @@
+package ai.timefold.solver.core.impl.move.streams.dataset;
+
+import java.util.ArrayList;
+import java.util.LinkedHashMap;
+import java.util.LinkedHashSet;
+import java.util.List;
+import java.util.Set;
+import java.util.function.Consumer;
+
+import ai.timefold.solver.core.impl.bavet.NodeNetwork;
+imp... | I wonder if we should change the logic to return the list of active datasets. This way, we wouldn't need to pass activeDatasetStreamSet as a parameter. |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,147 @@
+package ai.timefold.solver.core.impl.move.streams.dataset;
+
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.function.Consumer;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+imp... | Let's use `String::format` |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,52 @@
+package ai.timefold.solver.core.impl.move.streams.dataset;
+
+import java.util.Objects;
+import java.util.function.Predicate;
+
+import ai.timefold.solver.core.impl.bavet.common.tuple.TupleLifecycle;
+import ai.timefold.solver.core.impl.bavet.common.tuple.UniTuple;
+import ai.timefold.solver.core.impl... | ```suggestion
this.predicate = Objects.requireNonNull(predicate, "The predicate (null) cannot be null.");
``` |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,115 @@
+package ai.timefold.solver.core.impl.move.streams.dataset;
+
+import java.util.Collection;
+import java.util.Objects;
+import java.util.Set;
+import java.util.function.Predicate;
+
+import ai.timefold.solver.core.impl.bavet.common.TupleSource;
+import ai.timefold.solver.core.impl.bavet.common.tuple.T... | We are using `Objects::requireNonNull` in some places and in others not. |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,112 @@
+package ai.timefold.solver.core.impl.move.streams.dataset;
+
+import java.util.Objects;
+import java.util.Set;
+import java.util.function.BiPredicate;
+
+import ai.timefold.solver.core.impl.bavet.bi.joiner.DefaultBiJoiner;
+import ai.timefold.solver.core.impl.bavet.common.BavetAbstractConstraintStrea... | I would use an if block instead of a ternary operator |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,178 @@
+package ai.timefold.solver.core.impl.move.streams.generic;
+
+import java.util.Collection;
+import java.util.Collections;
+import java.util.List;
+import java.util.Objects;
+
+import ai.timefold.solver.core.api.domain.solution.PlanningSolution;
+import ai.timefold.solver.core.api.domain.variable.Plan... | The class is very well-documented. I missed these types of comments in the other move classes and data stream classes. |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,8 @@
+package ai.timefold.solver.core.impl.score.stream.bavet.common; | I wonder if these stream classes should be part of the package ` ai.timefold.solver.core.impl.bavet.common` |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -71,7 +71,8 @@ public <Stream_ extends BavetAbstractConstraintStream<Solution_>> Stream_ share(
* {@link BavetAbstractConstraintStream} implement equals/hashcode ignoring child streams.
* <p>
* {@link BavetConstraintSessionFactory#buildSession(Object, ConstraintMatchPolicy, boolean, Consumer)} need... | Please adjust the comment |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,107 @@
+package ai.timefold.solver.core.impl.move.streams.dataset;
+
+import static org.assertj.core.api.Assertions.assertThat;
+
+import ai.timefold.solver.core.impl.domain.solution.descriptor.DefaultPlanningListVariableMetaModel;
+import ai.timefold.solver.core.impl.domain.solution.descriptor.DefaultPlanni... | We might use this class in other places and will need to make it available to the different classes. |
timefold-solver | github_2023 | java | 1,349 | TimefoldAI | zepfred | @@ -0,0 +1,107 @@
+package ai.timefold.solver.core.impl.move.streams.dataset;
+
+import static org.assertj.core.api.Assertions.assertThat;
+
+import ai.timefold.solver.core.impl.domain.solution.descriptor.DefaultPlanningListVariableMetaModel;
+import ai.timefold.solver.core.impl.domain.solution.descriptor.DefaultPlanni... | Nice! |
timefold-solver | github_2023 | java | 1,454 | TimefoldAI | winklerm | @@ -500,6 +501,53 @@ void compareWithConstraintMatchesAndMatchAnalysis() {
});
}
+ @Test
+ void compareMatchCountsWithDifferentFetchPolicies() {
+ var constraintPackage = "constraintPackage";
+ var constraintName1 = "constraint1";
+ var constraintName2 = "constraint2";
+ ... | Q: Should `constraintAnalysisMap2` use `FETCH_ALL` for all entries? Or is the mixing of policies intentional? |
timefold-solver | github_2023 | others | 1,437 | TimefoldAI | TomCools | @@ -27,7 +27,7 @@ Gradle::
--
[source,shell,subs=attributes+]
----
-curl https://timefold.ai/product/upgrade/upgrade-timefold.gradle > upgrade-timefold.gradle ; gradle -Dorg.gradle.jvmargs=-Xmx2G --init-script upgrade-timefold.gradle rewriteRun -DtimefoldSolverVersion={timefold-solver-version} ; rm upgrade-timefold.... | This doesn't work for me.
The example gives a 404, because `upgrade-timefold.gradle` doesn't exist?
https://raw.githubusercontent.com/TimefoldAI/timefold-solver/refs/tags/v1.19.0/migration/upgrade-timefold.gradle |
timefold-solver | github_2023 | others | 1,437 | TimefoldAI | TomCools | @@ -33,7 +33,7 @@ Gradle::
--
[source,shell,subs=attributes+]
----
-curl https://timefold.ai/product/upgrade/upgrade-timefold.gradle > upgrade-timefold.gradle ; gradle -Dorg.gradle.jvmargs=-Xmx2G --init-script upgrade-timefold.gradle rewriteRun -DtimefoldSolverVersion={timefold-solver-version} ; rm upgrade-timefold.... | Ditto other comment |
timefold-solver | github_2023 | others | 1,437 | TimefoldAI | TomCools | @@ -140,22 +140,51 @@ Some solver configurations use the `Random` instance a lot more than others.
For example, Simulated Annealing depends highly on random numbers, while Tabu Search only depends on it to deal with score ties.
The environment mode influences the seed of that `Random` instance.
-These are the envir... | ```suggestion
any two runs of the same dataset with the same solver configuration must have the same result at every step.
``` |
timefold-solver | github_2023 | others | 1,437 | TimefoldAI | TomCools | @@ -180,76 +209,78 @@ Any variable that changed between the "before move" solution and the "after move
Any variable that changed between the "after move" solution and "after undo move" solution without either a
`beforeVariableChanged` or `afterVariableChanged` would be reported here.
-This mode is <<environmentMode... | ```suggestion
The `FULL_ASSERT` mode turns on all assertions and will fail-fast on a bug in a Move implementation, a constraint, the engine itself, ...
```
I think the extra explainer is not needed and makes the sentence harder to read. |
timefold-solver | github_2023 | others | 1,437 | TimefoldAI | TomCools | @@ -180,76 +209,78 @@ Any variable that changed between the "before move" solution and the "after move
Any variable that changed between the "after move" solution and "after undo move" solution without either a
`beforeVariableChanged` or `afterVariableChanged` would be reported here.
-This mode is <<environmentMode... | ```suggestion
It is also intrusive because it calls the method `calculateScore()` more frequently than a non-assert mode, making the `FULL_ASSERT` mode very slow.
``` |
timefold-solver | github_2023 | others | 1,437 | TimefoldAI | TomCools | @@ -180,76 +209,78 @@ Any variable that changed between the "before move" solution and the "after move
Any variable that changed between the "after move" solution and "after undo move" solution without either a
`beforeVariableChanged` or `afterVariableChanged` would be reported here.
-This mode is <<environmentMode... | Remove these 2 with the suggestions above. |
timefold-solver | github_2023 | others | 1,437 | TimefoldAI | TomCools | @@ -180,76 +209,78 @@ Any variable that changed between the "before move" solution and the "after move
Any variable that changed between the "after move" solution and "after undo move" solution without either a
`beforeVariableChanged` or `afterVariableChanged` would be reported here.
-This mode is <<environmentMode... | Should this be a TIP to make it stand out?
|
timefold-solver | github_2023 | others | 1,437 | TimefoldAI | TomCools | @@ -180,76 +209,78 @@ Any variable that changed between the "before move" solution and the "after move
Any variable that changed between the "after move" solution and "after undo move" solution without either a
`beforeVariableChanged` or `afterVariableChanged` would be reported here.
-This mode is <<environmentMode... | Do we have a list of the assertions somewhere? Because "ALL" (FULL_ASSERT) and "several" (NONE_INTRUSIVE_FULL_ASSERT) isn't very telling.
Side note: having to say we only activate "several assertions" for NONE_INTRUSIVE_FULL_ASSERT is a bit weird. (But probably legacy?) |
timefold-solver | github_2023 | others | 1,437 | TimefoldAI | TomCools | @@ -180,76 +209,78 @@ Any variable that changed between the "before move" solution and the "after move
Any variable that changed between the "after move" solution and "after undo move" solution without either a
`beforeVariableChanged` or `afterVariableChanged` would be reported here.
-This mode is <<environmentMode... | Since Seeds aren't really mentioned clearly before, I wouldn't introduce the concept here. |
timefold-solver | github_2023 | others | 1,437 | TimefoldAI | TomCools | @@ -180,76 +209,78 @@ Any variable that changed between the "before move" solution and the "after move
Any variable that changed between the "after move" solution and "after undo move" solution without either a
`beforeVariableChanged` or `afterVariableChanged` would be reported here.
-This mode is <<environmentMode... | I am a bit surprised, if it is only "negligibly slower", I would not expect it to disable concurrency optimizations (that sounds bigger than just negligible to the untrained eye). |
timefold-solver | github_2023 | java | 1,431 | TimefoldAI | Christopher-Chianelli | @@ -297,6 +297,10 @@ public TerminationConfig withTerminationClass(Class<? extends Termination> termi
return this;
}
+ public @NonNull TerminationConfig withDiminishedReturnsConfig() { | My understanding is this method is to use Diminished Returns with the default configuration? Maybe `withDefaultDiminishedReturnsConfig()`? `withDiminishedReturnsConfig()` looks odd to me without an argument. |
timefold-solver | github_2023 | java | 1,431 | TimefoldAI | Christopher-Chianelli | @@ -88,18 +92,32 @@ public boolean isAssertShadowVariablesAreNotStaleAfterStep() {
@Override
public void solvingStarted(SolverScope<Solution_> solverScope) {
- phaseTermination.solvingStarted(solverScope);
phaseLifecycleSupport.fireSolvingStarted(solverScope);
}
@Override
pu... | Can we fail fast when the solver is being built? This will fail during solving. |
timefold-solver | github_2023 | java | 1,431 | TimefoldAI | Christopher-Chianelli | @@ -40,26 +41,30 @@ public abstract class AbstractSolver<Solution_> implements Solver<Solution_> {
protected final BestSolutionRecaller<Solution_> bestSolutionRecaller;
// Note that the DefaultSolver.basicPlumbingTermination is a component of this termination.
// Called "solverTermination" to clearly dis... | Maybe rename field to `universalTermination`? Since there is a class called `SolverTermination` now. |
timefold-solver | github_2023 | java | 1,431 | TimefoldAI | Christopher-Chianelli | @@ -3,89 +3,110 @@
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
+import java.util.Objects;
import ai.timefold.solver.core.impl.phase.scope.AbstractPhaseScope;
import ai.timefold.solver.core.impl.phase.scope.AbstractStepScope;
import ai.timefold.solver.core.impl.solver.scope.Solver... | The `step` methods do not get called for the solver terminations? How does the solver terminations get updated then? |
timefold-solver | github_2023 | java | 1,431 | TimefoldAI | Christopher-Chianelli | @@ -3,89 +3,110 @@
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
+import java.util.Objects;
import ai.timefold.solver.core.impl.phase.scope.AbstractPhaseScope;
import ai.timefold.solver.core.impl.phase.scope.AbstractStepScope;
import ai.timefold.solver.core.impl.solver.scope.Solver... | What happens to universal terminations? Do they get put in both lists? |
timefold-solver | github_2023 | java | 1,431 | TimefoldAI | Christopher-Chianelli | @@ -0,0 +1,132 @@
+package ai.timefold.solver.core.impl.solver;
+
+import ai.timefold.solver.core.api.score.buildin.simple.SimpleScore;
+import ai.timefold.solver.core.api.solver.SolverFactory;
+import ai.timefold.solver.core.config.constructionheuristic.ConstructionHeuristicPhaseConfig;
+import ai.timefold.solver.core... | Outdated comment; no step count termination is set. |