ref(profile-chunks): Move profile chunks to new processing pipeline #5505
Open
Dav1dde wants to merge 1 commit into master from dav1d/profile-chunks-processing
+369 −181
@@ -0,0 +1,22 @@

use relay_dynamic_config::Feature;

use crate::processing::Context;
use crate::processing::profile_chunks::{Error, Result};

/// Checks whether the profile ingestion feature flag is enabled for the current project.
pub fn feature_flag(ctx: Context<'_>) -> Result<()> {
    let feature = match ctx
        .project_info
        .has_feature(Feature::ContinuousProfilingBetaIngest)
    {
        // Legacy feature.
        true => Feature::ContinuousProfilingBeta,
        // The post release ingestion feature.
        false => Feature::ContinuousProfiling,
    };

    match ctx.should_filter(feature) {
        true => Err(Error::FilterFeatureFlag),
        false => Ok(()),
    }
}
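To make the fallback above easy to read at a glance, here is a minimal, self-contained sketch of the same decision. `Feature`, `Decision`, `has_beta_ingest`, and the `enabled` lookup are local stand-ins introduced only for illustration, not the real `relay_dynamic_config` or `Context` APIs.

```rust
// Local stand-ins for illustration; not the real relay types.
#[derive(Clone, Copy, Debug, PartialEq)]
enum Feature {
    ContinuousProfilingBeta,
    ContinuousProfiling,
}

#[derive(Debug, PartialEq)]
enum Decision {
    Keep(Feature),
    FilterFeatureFlag,
}

// `has_beta_ingest` stands for the legacy `ContinuousProfilingBetaIngest` flag
// being set on the project; `enabled` is a hypothetical lookup of the selected
// feature (the inverse of `should_filter` in the PR).
fn decide(has_beta_ingest: bool, enabled: impl Fn(Feature) -> bool) -> Decision {
    let feature = if has_beta_ingest {
        // Legacy projects keep being gated on the beta feature.
        Feature::ContinuousProfilingBeta
    } else {
        // Everyone else is gated on the post-release feature.
        Feature::ContinuousProfiling
    };

    if enabled(feature) {
        Decision::Keep(feature)
    } else {
        Decision::FilterFeatureFlag
    }
}

fn main() {
    // Legacy beta project with the beta feature enabled: chunks are kept.
    let d = decide(true, |f| f == Feature::ContinuousProfilingBeta);
    assert_eq!(d, Decision::Keep(Feature::ContinuousProfilingBeta));

    // Regular project without the post-release feature: chunks are filtered.
    let d = decide(false, |_| false);
    assert_eq!(d, Decision::FilterFeatureFlag);
}
```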
@@ -0,0 +1,189 @@

use std::sync::Arc;

use relay_profiling::ProfileType;
use relay_quotas::{DataCategory, RateLimits};

use crate::Envelope;
use crate::envelope::{EnvelopeHeaders, Item, ItemType, Items};
use crate::managed::{Counted, Managed, ManagedEnvelope, ManagedResult as _, Quantities, Rejected};
use crate::processing::{self, Context, CountRateLimited, Forward, Output, QuotaRateLimiter};
use crate::services::outcome::{DiscardReason, Outcome};
use smallvec::smallvec;

mod filter;
mod process;

pub type Result<T, E = Error> = std::result::Result<T, E>;

#[derive(Debug, thiserror::Error)]
pub enum Error {
    /// Error raised in [`relay_profiling`].
    #[error("Profiling Error: {0}")]
    Profiling(#[from] relay_profiling::ProfileError),
    /// The profile chunks are rate limited.
    #[error("rate limited")]
    RateLimited(RateLimits),
    /// Profile chunks filtered because of a missing feature flag.
    #[error("profile chunks feature flag missing")]
    FilterFeatureFlag,
}

impl From<RateLimits> for Error {
    fn from(value: RateLimits) -> Self {
        Self::RateLimited(value)
    }
}

impl crate::managed::OutcomeError for Error {
    type Error = Self;

    fn consume(self) -> (Option<Outcome>, Self::Error) {
        let outcome = match &self {
            Self::Profiling(relay_profiling::ProfileError::Filtered(f)) => {
                Some(Outcome::Filtered(f.clone()))
            }
            Self::Profiling(err) => Some(Outcome::Invalid(DiscardReason::Profiling(
                relay_profiling::discard_reason(err),
            ))),
            Self::RateLimited(limits) => {
                let reason_code = limits.longest().and_then(|limit| limit.reason_code.clone());
                Some(Outcome::RateLimited(reason_code))
            }
            Self::FilterFeatureFlag => None,
        };
        (outcome, self)
    }
}

/// A processor for profile chunks.
///
/// It processes items of type: [`ItemType::ProfileChunk`].
#[derive(Debug)]
pub struct ProfileChunksProcessor {
    limiter: Arc<QuotaRateLimiter>,
}

impl ProfileChunksProcessor {
    /// Creates a new [`Self`].
    pub fn new(limiter: Arc<QuotaRateLimiter>) -> Self {
        Self { limiter }
    }
}

impl processing::Processor for ProfileChunksProcessor {
    type UnitOfWork = SerializedProfileChunks;
    type Output = ProfileChunkOutput;
    type Error = Error;

    fn prepare_envelope(
        &self,
        envelope: &mut ManagedEnvelope,
    ) -> Option<Managed<Self::UnitOfWork>> {
        let profile_chunks = envelope
            .envelope_mut()
            .take_items_by(|item| matches!(*item.ty(), ItemType::ProfileChunk))
            .into_vec();

        if profile_chunks.is_empty() {
            return None;
        }

        Some(Managed::from_envelope(
            envelope,
            SerializedProfileChunks {
                headers: envelope.envelope().headers().clone(),
                profile_chunks,
            },
        ))
    }

    async fn process(
        &self,
        mut profile_chunks: Managed<Self::UnitOfWork>,
        ctx: Context<'_>,
    ) -> Result<Output<Self::Output>, Rejected<Error>> {
        filter::feature_flag(ctx).reject(&profile_chunks)?;

        process::process(&mut profile_chunks, ctx);

        let profile_chunks = self.limiter.enforce_quotas(profile_chunks, ctx).await?;

        Ok(Output::just(ProfileChunkOutput(profile_chunks)))
    }
}

/// Output produced by [`ProfileChunksProcessor`].
#[derive(Debug)]
pub struct ProfileChunkOutput(Managed<SerializedProfileChunks>);

impl Forward for ProfileChunkOutput {
    fn serialize_envelope(
        self,
        _: processing::ForwardContext<'_>,
    ) -> Result<Managed<Box<Envelope>>, Rejected<()>> {
        let Self(profile_chunks) = self;
        Ok(profile_chunks
            .map(|pc, _| Envelope::from_parts(pc.headers, Items::from_vec(pc.profile_chunks))))
    }

    #[cfg(feature = "processing")]
    fn forward_store(
        self,
        s: processing::forward::StoreHandle<'_>,
        ctx: processing::ForwardContext<'_>,
    ) -> Result<(), Rejected<()>> {
        use crate::services::store::StoreProfileChunk;

        let Self(profile_chunks) = self;
        let retention_days = ctx.event_retention().standard;

        for item in profile_chunks.split(|pc| pc.profile_chunks) {
            s.store(item.map(|item, _| StoreProfileChunk {
                retention_days,
                payload: item.payload(),
                quantities: item.quantities(),
            }));
        }

        Ok(())
    }
}

/// Serialized profile chunks extracted from an envelope.
#[derive(Debug)]
pub struct SerializedProfileChunks {
    /// Original envelope headers.
    pub headers: EnvelopeHeaders,
    /// List of serialized profile chunk items.
    pub profile_chunks: Vec<Item>,
}

impl Counted for SerializedProfileChunks {
    fn quantities(&self) -> Quantities {
        let mut ui = 0;
        let mut backend = 0;

        for pc in &self.profile_chunks {
            match pc.profile_type() {
                Some(ProfileType::Ui) => ui += 1,
                Some(ProfileType::Backend) => backend += 1,
                None => {}
            }
        }

        let mut quantities = smallvec![];
        if ui > 0 {
            quantities.push((DataCategory::ProfileChunkUi, ui));
        }
        if backend > 0 {
            quantities.push((DataCategory::ProfileChunk, backend));
        }

        quantities
    }
}

impl CountRateLimited for Managed<SerializedProfileChunks> {
    type Error = Error;
}
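The `Counted` implementation above is what ties profile chunks to the two data categories (`ProfileChunk` for backend chunks, `ProfileChunkUi` for UI chunks) used for outcomes and rate limiting. Below is a minimal sketch of just that tallying, with local stand-in enums and a plain `Vec` instead of the real `relay_profiling`, `relay_quotas`, and `Quantities` types; the names here are illustrative assumptions.

```rust
// Stand-in enums for illustration; the real types live in relay_profiling / relay_quotas.
#[derive(Clone, Copy, PartialEq)]
enum ProfileType { Ui, Backend }

#[derive(Debug, PartialEq)]
enum DataCategory { ProfileChunkUi, ProfileChunk }

// Tally chunks by profile type into per-category quantities.
fn quantities(chunks: &[Option<ProfileType>]) -> Vec<(DataCategory, usize)> {
    let ui = chunks.iter().filter(|t| **t == Some(ProfileType::Ui)).count();
    let backend = chunks.iter().filter(|t| **t == Some(ProfileType::Backend)).count();

    let mut out = Vec::new();
    // Chunks without a known profile type are not counted; before processing
    // runs, the item header may not carry a type yet.
    if ui > 0 {
        out.push((DataCategory::ProfileChunkUi, ui));
    }
    if backend > 0 {
        out.push((DataCategory::ProfileChunk, backend));
    }
    out
}

fn main() {
    let chunks = [Some(ProfileType::Ui), Some(ProfileType::Backend), None];
    assert_eq!(
        quantities(&chunks),
        vec![(DataCategory::ProfileChunkUi, 1), (DataCategory::ProfileChunk, 1)]
    );
}
```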
@@ -0,0 +1,68 @@

use relay_profiling::ProfileType;
use relay_quotas::DataCategory;

use crate::envelope::{ContentType, Item, ItemType};
use crate::processing::Context;
use crate::processing::Managed;
use crate::processing::profile_chunks::{Result, SerializedProfileChunks};

/// Processes profile chunks.
pub fn process(profile_chunks: &mut Managed<SerializedProfileChunks>, ctx: Context<'_>) {
    // Only run this 'expensive' processing step in processing Relays.
    if !ctx.is_processing() {
        return;
    }

    let client_ip = profile_chunks.headers.meta().client_addr();
    let filter_settings = &ctx.project_info.config.filter_settings;

    profile_chunks.retain(
        |pc| &mut pc.profile_chunks,
        |item, records| -> Result<()> {
            let pc = relay_profiling::ProfileChunk::new(item.payload())?;

            // Validate the profile type on the item header against the one from the
            // payload, or set it if missing.
            //
            // This is currently necessary to ensure profile chunks are emitted in the
            // correct data category, as well as rate limited with the correct data category.
            //
            // In the future we plan to make the profile type on the item header mandatory.
            // For more context see also: <https://github.com/getsentry/relay/pull/4595>.
            if item
                .profile_type()
                .is_some_and(|pt| pt != pc.profile_type())
            {
                return Err(relay_profiling::ProfileError::InvalidProfileType.into());
            }

            // Update the profile type to ensure the following outcomes are emitted in the
            // correct data category.
            //
            // Once the profile type on the item header is required, this step is no longer needed.
            if item.profile_type().is_none() {
                item.set_profile_type(pc.profile_type());
                match pc.profile_type() {
                    ProfileType::Ui => records.modify_by(DataCategory::ProfileChunkUi, 1),
                    ProfileType::Backend => records.modify_by(DataCategory::ProfileChunk, 1),
                }
            }

            pc.filter(client_ip, filter_settings, ctx.global_config)?;

            let expanded = pc.expand()?;
            if expanded.len() > ctx.config.max_profile_size() {
                return Err(relay_profiling::ProfileError::ExceedSizeLimit.into());
            }

            *item = {
                let mut item = Item::new(ItemType::ProfileChunk);
                item.set_profile_type(pc.profile_type());
                item.set_payload(ContentType::Json, expanded);
                item
            };

            Ok(())
        },
    );
}
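The core invariant enforced here is the reconciliation between the profile type declared on the item header and the type parsed from the payload: a mismatch rejects the chunk, a missing header value is filled in from the payload. The sketch below isolates that rule with local stand-in types (`ProfileType`, `Reconciled`, and `reconcile` are illustrative names, not part of the PR).

```rust
// Stand-in types for illustration of the header/payload reconciliation rule.
#[derive(Clone, Copy, Debug, PartialEq)]
enum ProfileType { Ui, Backend }

#[derive(Debug, PartialEq)]
enum Reconciled {
    /// Header already matched the payload; nothing to do.
    AlreadySet(ProfileType),
    /// Header was missing and is filled in from the payload.
    Filled(ProfileType),
    /// Header and payload disagree; the chunk is rejected as invalid.
    Mismatch,
}

fn reconcile(header: Option<ProfileType>, payload: ProfileType) -> Reconciled {
    match header {
        Some(h) if h != payload => Reconciled::Mismatch,
        Some(h) => Reconciled::AlreadySet(h),
        None => Reconciled::Filled(payload),
    }
}

fn main() {
    assert_eq!(reconcile(None, ProfileType::Ui), Reconciled::Filled(ProfileType::Ui));
    assert_eq!(reconcile(Some(ProfileType::Backend), ProfileType::Ui), Reconciled::Mismatch);
    assert_eq!(
        reconcile(Some(ProfileType::Backend), ProfileType::Backend),
        Reconciled::AlreadySet(ProfileType::Backend)
    );
}
```

In the real code the `Filled` case additionally calls `records.modify_by(...)` so that subsequent outcomes are booked against the correct data category.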