-
We adjust the commodity balance dual by adding the highest capacity dual. But let's separate issues here. Reduced cost - by its nature - includes scarcity pricing. What we are doing is subtracting the impact of the scarcity-priced commodities, and then adding back the impact of commodity prices after scarcity pricing is removed (i.e. after our calculation to remove scarcity pricing).
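In symbols, a rough sketch of that adjustment - writing $\lambda$ for the scarcity-inclusive commodity price, $\lambda^*$ for the price with scarcity removed, and $q_c$ for the asset's flow of commodity $c$ (the exact notation in investment.tex may differ):

$$ RC^* = RC - \sum_{c} q_c\,\lambda_c + \sum_{c} q_c\,\lambda^*_c $$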
The purpose is to compare assets on a more level playing field, where they are all compared against a demand profile with specific potential utilisation. The first tranche will usually have high potential utilisation (not constrained by lack of demand). The last tranche will usually have low potential utilisation, with a much more limited range of options to economically serve it. Worth noting this solution is still rough and heuristic. A more "true" solution - still with shortcomings - would be to use game theory, but solution times for such problems are too slow for us. So - a compromise - but the best I've come up with so far.
We consider the whole demand profile, not just the unmet bit (which is a separate issue - note we don't use this so much anymore; the algorithm changed a few weeks ago). Basically, the point here is that the bottom bit of the demand profile has high utilisation (i.e. more time slices will have demand in them between 0 and 1/5 of the peak demand). Might be easier if I draw a picture of this!
Sorry, yes, this is very rough. Can we just use the number of time slices divided by 2 for now? We'll need to limit it or make it a user input in future.
All time slices are in each tranche, though as we get to higher tranches, there may be no demand left in one or many time slices. Basically, time slices are on the horizontal axis, tranche (demand level) is on the vertical axis. Overall, good to question the "tranche" logic; the processing is tricky, I think. It's dynamic, adapting according to what demand the last best process could serve, and what demand is left. This will be worth a discussion before you start on it.
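A very rough sketch of that structure in code (all names are made up, and the half-of-peak cut is only the MVP suggestion from later in this thread, not a final design): each tranche is a horizontal band of the demand profile, and each time slice contributes whatever part of its load falls inside the band.

```rust
/// Demand served within one tranche, per time slice. The tranche is the
/// horizontal band [lower, upper) of the demand profile; each time slice
/// contributes the part of its load that falls inside the band.
fn tranche_demand(demand: &[f64], lower: f64, upper: f64) -> Vec<f64> {
    demand
        .iter()
        .map(|&d| (d - lower).clamp(0.0, upper - lower))
        .collect()
}

fn main() {
    // Example loads per time slice; peak = 10.
    let demand = [2.0, 4.0, 4.0, 10.0];
    let peak = demand.iter().cloned().fold(f64::MIN, f64::max);

    // MVP suggestion from below: cut the first tranche at half of peak load.
    let first = tranche_demand(&demand, 0.0, peak / 2.0);
    let second = tranche_demand(&demand, peak / 2.0, peak);
    println!("{first:?}");  // [2.0, 4.0, 4.0, 5.0]
    println!("{second:?}"); // [0.0, 0.0, 0.0, 5.0]
}
```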
After MVP is fine. And yes, good to have an issue.
What am I looking at? Just dispatch, right?
Yeah, process marginal prices are there in the investment.tex doc, related to reduced costs. They are in a pre-calculation step. Indeed, this has changed since our last in-depth discussions.
I think update using the LaTeX docs I have, with a strong caveat that it's a work in progress and will change. No point having out-of-date docs.
-
Hi Alex,
Your load factor calc looks correct. ChatGPT wrong.
Re investment appraisal yes we’ll need demand by time slice, and time slice lengths I think - it’s an optimisation sub-problem very similar to dispatch - which uses these things in various ways. I guess this isn’t written out in investment.tex (i.e. the constraints aren’t written out). I had thought they’d be similar enough to dispatch that this would not be needed?
Re books - not really I’m afraid. There’s documentation out there for other models but ours is different enough such that these won’t be that helpful.
From: Alex Dewar (Friday, June 27, 2025, 10:09 AM)
Actually, I think I misspoke there.
Thinking about candidate assets: $\lambda$ is the shadow commodity price, given by the commodity balance duals. We calculate our commodity prices by adding the highest capacity dual to the commodity balance duals. I think this is what you mean by "adjusted commodity prices" ($\lambda^*$), right? But in the equation for calculating $RC^*_{c,a,r,t}$ you're subtracting the sum of all $\lambda^*$ from the sum of all $\lambda$. Isn't the difference between $\lambda$ and $\lambda^*$ for any given commodity just the highest capacity dual? I appreciate that this is only for candidate assets when scarcity pricing is disabled.
> We subtract the scarcity-inclusive prices, and add back the scarcity-removed prices. It just adjusts RC a bit (to remove the scarcity element of the price). True, maybe there is some other way to represent this with the capacity duals, etc. But I think the way I've done it is more intuitive - working with the prices.
Okey doke. In that case, we can just store both sets of prices internally for now. If it becomes clear later that we can optimise one of them away, then we can do that.
This is starting to make sense... Why is the first tranche 0-4 though? Do we not divide up the peak demand by number of tranches, which would give 3.333? If not, how can we calculate the bounds of the tranches? I can see that in this case the 4 is just the lowest load, but what's the general solution?
> Yes yes I just made up the 0->4 thing as an example. (I chose 0->4 because all time slices are at 100% load factor in that range for this example.) Maybe for the MVP just make the 1st tranche 1/2 the peak load, and make an issue to revisit the sizing of the tranches? The choice here probably matters a fair bit, but it's one issue among many, so perhaps not worth dwelling on now.
Ah ok. I'll do that.
Just to make sure I'm calculating the load factor correctly: if we're looking at the load for the first tranche (let's say it's 0-4) and the time slices have loads of [2, 4, 4], would we just calculate the load factor by taking the average of the loads (which are already adjusted for time slice length) and dividing by the maximum capacity of the tranche, i.e.:
$$ \frac{2 + 4 + 4}{3\times 4} = 0.8333 $$
I asked ChatGPT about it (I don't normally, but couldn't find any good human-authored resources out there!) and it suggested adding the time slice lengths for "active" time slices together (i.e. those with load >0). In this example, they're all active, so the load factor would be 1. My approach makes more sense to me, but I just wanted to check there wasn't anything to ChatGPT's suggestion?
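In code, my version would be something like this (a sketch; names are made up):

```rust
/// Load factor of a tranche: the mean load across time slices divided by
/// the tranche's maximum capacity. Assumes the loads are already adjusted
/// for time slice length, as in the example above.
fn load_factor(loads: &[f64], tranche_cap: f64) -> f64 {
    let mean = loads.iter().sum::<f64>() / loads.len() as f64;
    mean / tranche_cap
}

fn main() {
    // The example above: loads [2, 4, 4] against a tranche capacity of 4.
    println!("{:.4}", load_factor(&[2.0, 4.0, 4.0], 4.0)); // 0.8333
}
```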
Incidentally, if you know of any resources (links, books etc.) that have background info on energy systems, that would be helpful. A lot of it's fairly simple, but I obviously don't have any background knowledge on this stuff. It took me a while to realise that dividing demand into tranches is a standard thing to do, for example!
Final question: presumably after we've worked out the tranches on the basis of load (FlowPerYear), we then convert back to demand (Flow) by multiplying by time slice length for the investment appraisal step?
It all looks good to me. Just checking whether you were planning to do much more work on it. We'll start adding bits to the repo, then.
> Not planning much more work, so I think go ahead.
👍
-
Re Flow versus FlowPerYear. I think the answer is both. You'll need FlowPerYear to figure out how much Flow is in the tranche. You'll need Flow as this is what drives costs.
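A minimal sketch of the conversion (assuming, purely for illustration, that time slice lengths are stored as fractions of a year; names are made up):

```rust
/// Convert a flow rate (FlowPerYear) in one time slice into the flow
/// (Flow) occurring within that slice.
fn flow(flow_per_year: f64, ts_length: f64) -> f64 {
    // ts_length is the time slice's length as a fraction of the year.
    flow_per_year * ts_length
}

fn main() {
    // A load of 4.0 (FlowPerYear) in a slice covering a quarter of the
    // year contributes 1.0 units of Flow.
    println!("{}", flow(4.0, 0.25)); // 1
}
```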
From: Alex Dewar (Friday, June 27, 2025, 1:48 PM)
> Your load factor calc looks correct. ChatGPT wrong.
👍
> Re investment appraisal yes we’ll need demand by time slice, and time slice lengths I think - it’s an optimisation sub-problem very similar to dispatch - which uses these things in various ways. I guess this isn’t written out in investment.tex (i.e. the constraints aren’t written out). I had thought they’d be similar enough to dispatch that this would not be needed?
I think it's clear enough, but we'll ask if we have questions. Just wanted to double-check that we wanted demand tranches in units of Flow cf. FlowPerYear for this step.
> Re books - not really I’m afraid. There’s documentation out there for other models but ours is different enough such that these won’t be that helpful.
That's what I feared. Oh well. We'll just have to make sure we document the code nice and clearly.
-
We don’t need to worry about the load factor of the tranches - we *always* want to use the lowest tranche first (which will *always* have the highest load factor - or equal highest). All we need to decide is the load level to cut off the tranche. For MVP we’re cutting at half of peak load (with an issue to revisit this).
I did note in the description that we go for highest load factor first, but it’s something that just happens automatically if you go for the lowest tranche first.
As an aside, it is actually possible that a later tranche has a higher load factor. This is because we calculate tranche by tranche dynamically: the asset chosen for the last tranche may not have served all the demand in that tranche, and that unmet demand is added back to the remaining demand to be served before considering the next tranche.
Probably easier if I drew a picture of this!
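In lieu of a picture, a very rough sketch of that dynamic loop (everything here is a placeholder - in particular `appraise_and_serve` is a hypothetical stand-in for the whole dispatch-like appraisal sub-problem):

```rust
/// Heuristic tranche loop: carve off the lowest band of the remaining
/// demand, let the best process serve what it can, and leave any unmet
/// demand in the remaining profile before cutting the next tranche.
fn serve_demand(mut remaining: Vec<f64>, max_tranches: usize) -> Vec<f64> {
    for _ in 0..max_tranches {
        let peak = remaining.iter().cloned().fold(0.0_f64, f64::max);
        if peak <= 0.0 {
            break; // nothing left to serve
        }
        let cut = peak / 2.0; // MVP rule: cut the tranche at half of peak load
        let tranche: Vec<f64> = remaining.iter().map(|&d| d.min(cut)).collect();

        // Placeholder for the investment appraisal: what the chosen
        // process actually serves may be less than the tranche demand.
        let served = appraise_and_serve(&tranche);

        // Only served demand is removed; unmet tranche demand stays in
        // `remaining`, which is why a later tranche can end up with a
        // higher load factor than an earlier one.
        for (r, s) in remaining.iter_mut().zip(&served) {
            *r -= *s;
        }
    }
    remaining
}

/// Hypothetical stand-in for the dispatch-like appraisal sub-problem.
fn appraise_and_serve(tranche: &[f64]) -> Vec<f64> {
    tranche.to_vec() // assume fully served, purely for illustration
}

fn main() {
    let leftover = serve_demand(vec![2.0, 4.0, 4.0, 10.0], 8);
    println!("{leftover:?}");
}
```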
From: Alex Dewar (Friday, June 27, 2025, 5:04 PM)
> Re Flow versus FlowPerYear. I think the answer is both. You'll need FlowPerYear to figure out how much Flow is in the tranche. You'll need Flow as this is what drives costs.
Ok, thanks 👍.
Apologies about the slew of questions but I do have one last one... Am I right in thinking that the first tranche will always have the highest load factor, at least with the way we've formulated it? If so, we can skip the load factor calculation for now. Presumably there are cases where this isn't the case, otherwise no one would bother calculating it. But in our case, as I understand it, the load for every time slice in the second tranche will always be less than or equal to the load in the first tranche (for a given time slice, the second tranche will only have any load at all if the first tranche is maxed out, otherwise it'll be zero).
Is that right or am I missing something?
-
Great thanks!
From: Alex Dewar (Friday, June 27, 2025, 5:53 PM)
> We don’t need to worry about the load factor of the tranches - we *always* want to use the lowest tranche first (which will *always* have the highest load factor - or equal highest). All we need to decide is the load level to cut off the tranche. For MVP we’re cutting at half of peak load (with an issue to revisit this).
Great, sounds doable 👍.
> I did note in the description that we go for highest load factor first, but it’s something that just happens automatically if you go for the lowest tranche first.
>
> As an aside, it is actually possible that a later tranche has a higher load factor. This is because we calculate tranche by tranche dynamically: the asset chosen for the last tranche may not have served all the demand in that tranche, and that unmet demand is added back to the remaining demand to be served before considering the next tranche.
Interesting... somewhat counter-intuitive. I'll start with the first tranche for now and we can build up the model from there.
I've got some (very rough) code for most of this now. Exciting to see it all coming together!
-
We have working agent investment now 🥳; closing
-
Question mega-thread
I thought it would be easier to start a new discussion for all these questions so they're in one place.
Agent investment
- … `time_slice_level` or something?
- … (`mothball_years` model parameter #649), but it's not obvious to me whether it's essential to the model or we could do it after the MVP. What do you think?

Other business