vault backup: 2024-11-26 18:16:19

02-Note/ASoul/动画相关/Sequaio.md
# Motion Write Logic

Log output when a Motion is imported correctly:
```c++
[2024.10.23-05.22.35:360][637]LogSequoia: SequoiaFileRefPool:: Load FileRef start ++++++:C:/LiveDirectorSaved/Sequoia/心宜思诺一周年/LIKE THAT.Sequoia/1D20B3CA4D4B5CED1D7312AE0D9EBF9F.motion
```
Log output in the error case:

```c++
LogSequoia: SequoiaData file ref load complete, sequoiaPath = :/Sequoia/心宜思诺一周年/初智齿.Sequoia/初智齿.json
```
## Recording Logic

```c++
LogSequoia: UMotionCaptureRecorder::StartRecord start record motion frames from avatar:Idol.F07
LogSequoia: UMotionCaptureRecorder::StopRecordstop record motion frames from avatar:Idol.F07, frames:0
```
02-Note/ASoul/动画相关/动捕&面捕.md
# Test Workflow

1. The phone and the PC must be on the same network segment.
2. Configure the MotionServer IP, plus ARKitNetConfig.ini, MotionNetConfig.ini, and MotionNetConfig2.ini.
3. Open FaceMask, set the character name and the PC's IP, then press Connect.
4. Open MotionProcess, set the character name, then press Connect.

**You can open Map_MotionProcess directly for development and testing.**
## Editor Test Setup

GM_TsLiveDirectorGameMode => PC_TsDirectorController => BP_MotionSender0, BP_MotionReceiver0
# IdolAnimInstance

UpdateAnimation runs PrepareMocapParameters() every frame, which fetches a reference to TsMotionRetargetComponent (normally the TsMotionRetargetComponent on the IdolActor's Controller).

TsMotionRetargetComponent holds TsChingmuMocapReceiverActor => ChingmuMocapReceiverActor.
# Related Animation Nodes

## AnimNode_FullBody

Chingmu (青瞳) mocap data is received through the **AnimNode_FullBody** node, specifically via the receive logic of AMotionReceiverActor.
## AnimNode_FacialExpression

The FaceMask facial-capture node.

The actual data reception happens in TsMediaPipeMocapReceiverActor and TsMotionRetargetComponent.

### FacialExpressionConfigAsset

Used to configure the various expression parameters. All characters' expression assets live under `Content/LiveDirector/FaceExpressionConfig`.
The key part is the curve mapping: a single ARKit blendshape weight (0~1) is remapped onto 5 corresponding blendshapes, which gives a much finer expression result. For example, tongueOut maps to:

- tongueOut_1
- tongueOut_2
- tongueOut_3
- tongueOut_4
- tongueOut_5
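As an illustration of this 1-to-5 split, a staged-ramp remap like the sketch below makes the sub-shapes engage one after another as the input weight grows. The actual curves stored in FacialExpressionConfigAsset may differ; the linear, equal-width stages here are an assumption.

```cpp
#include <algorithm>
#include <array>

// Hypothetical staged-ramp split of one ARKit weight (0~1), e.g. tongueOut,
// into 5 sub-blendshape weights (tongueOut_1 .. tongueOut_5). Each stage
// ramps 0 -> 1 across its own fifth of the input range.
std::array<float, 5> SplitBlendShapeWeight(float Weight)
{
    Weight = std::clamp(Weight, 0.0f, 1.0f);
    std::array<float, 5> Stages{};
    constexpr float Slice = 1.0f / 5.0f;
    for (int i = 0; i < 5; ++i)
    {
        const float Local = (Weight - i * Slice) / Slice; // progress inside stage i
        Stages[i] = std::clamp(Local, 0.0f, 1.0f);
    }
    return Stages;
}
```

With this mapping, an input of 0.5 fully drives the first two sub-shapes, half-drives the third, and leaves the rest at zero.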
The BlendShape Maya source file is located at
## HandPoseAnimNode (adjusts hand poses?)

```c++
FName HandPoseDataTablePath = TEXT("DataTable'/Game/ResArt/HandPose/DT_HandPoseConfig.DT_HandPoseConfig'");
```
# Related Actors

- AMotionReceiverActor: receives body mocap data.
- AMediaPipeMocapReceiverActor: receives facial mocap data.
## AMediaPipeMocapReceiverActor

1. (AMediaPipeMocapReceiverActor) Tick => OnGetMediaPipeData() => **(TsMediaPipeSkeleton) Skeleton.OnGetMediaPipeData(Data)**; this function's logic lives in TsMediaPipeMocapReceiverActor.
2. (TsMediaPipeMocapReceiverActor) ReceiveTick() => UpdateAnimation(), which filters and adjusts the data and then **feeds the facial data into AnimNode_FacialExpression**.
02-Note/ASoul/动画相关/动捕逻辑.md
# Related Classes

- TsArkitDataReceiver (ArkitDataReceiver)
- TsChingmuMocapReceiverActor (ChingmuMocapReceiverActor)
- TsMotionReceiverActor (MotionReceiverActor) => BP_MotionReceiver: defines MotionNetConfig.ini.
- TsMotionSenderActor (MotionSenderActor)
# TsChingmuMocapReceiverActor

***Only one spawned TsChingmuMocapReceiverActor exists in the map; it manages all mocap data reception.***

1. Init(): TsChingmuMocapReceiverActor is only spawned on the Server.
2. ConnectChingMu(): **ChingmuComp.StartConnectServer()**
3. Multicast_AligmMotionTime(): finds the BP_MotionReceiver in the scene and calls Receiver.AlignTimeStamp().
## ChingmuMocapReceiverActor

Core logic:

- ***FChingmuThread::Run()***
- ***AChingmuMocapReceiverActor::Tick()***
- AChingmuMocapReceiverActor::DoSample()
```c++
void AChingmuMocapReceiverActor::BeginPlay()
{
	Super::BeginPlay();
	MaxHumanCount = 10;
	MaxRigidBodyCount = 10;
	CacheLimit = 240;
	SampledHumanData = NewObject<UMocapFrameData>();
	ThreadInterval = 0.002;
	// BackSampleTime = 100 ms, CHINGMU_SERVER_FPS = 120
	BackIndexCount = int64(UMotionUtils::BackSampleTime / (1000.0 / CHINGMU_SERVER_FPS));
	ChingmuComp = Cast<UChingMUComponent>(GetComponentByClass(UChingMUComponent::StaticClass()));
	if (ChingmuComp == nullptr)
	{
		UE_LOG(LogTemp, Error, TEXT("Chingmu Component is missing!!"));
	}
	Thread = new FChingmuThread("Chingmu Data Thread", this);
	Sender = GetMotionSender();
}
```
After FChingmuThread::Run() finishes building an [[#ST_MocapFrameData]], it enqueues each performer's mocap frame into FrameQueue. Tick() dequeues the frames and stores them into AllHumanFrames / AllRigidBodyFrames.

- AllHumanFrames
	- ID
	- std::vector<ST_MocapFrameData*> Frames
		- ID
		- TimeStamp
		- FrameIndex
		- BonesWorldPos
		- BonesLocalRot
```c++
void AChingmuMocapReceiverActor::Tick(float DeltaTime)
{
	Super::Tick(DeltaTime);

	if(!Sender)
	{
		Sender = GetMotionSender();
	}
	const auto CurTime = ULiveDirectorStatics::GetUnixTime(); // current system time
	if(UseThread)
	{
		// Threaded path:
		// drain the Chingmu data from the frame queue
		while (!FrameQueue.IsEmpty()) // process everything queued
		{
			ST_MocapFrameData* Frame;
			if (FrameQueue.Dequeue(Frame))
			{
				// Store the frame into AllHumanFrames / AllRigidBodyFrames,
				// keyed by its Human/RigidBody ID.
				PutMocapDataIntoFrameList(Frame);
			}
		}
	}

	DoSample(AllHumanFrames);
	DoSample(AllRigidBodyFrames);

	// Recompute the average packet interval once per second
	if (CurTime - LastCheckIntervalTime > 1000)
	{
		if (AllHumanFrames.Num() > 0)
		{
			AllHumanFrames[0]->CalculatePackageAverageInterval(this->PackageAverageInterval);
			LastCheckIntervalTime = CurTime;
		}
	}
}
```
### Sampling Logic

- ***SampleByTimeStamp()***
```c++
void AChingmuMocapReceiverActor::DoSample(TArray<MocapFrames*>& Frames)
{
	for (auto i = 0; i < Frames.Num(); i++)
	{
		// Drop old frames once the cache exceeds the limit
		// (240 frames, roughly 2~4 seconds of data).
		Frames[i]->CheckSize(CacheLimit);
		// Interpolate; the result is stored in SampledHumanData.
		if (SampleByTimeStamp(Frames[i]->Frames))
		{
			// Runs the matching logic in TsChingmuMocapReceiverActor.ts: mainly
			// fires an event that hands the data to TsMotionRetargetComponent.ts
			// or TsSceneLiveLinkPropActor.ts (mocap props).
			SendFrameToCharacter();
		}
	}
}

class MocapFrames
{
public:
	int ID;
	std::vector<ST_MocapFrameData*> Frames = {};

public:
	MocapFrames(): ID(0)
	{
	}

	bool CheckSize(const int Limit)
	{
		if (Frames.size() > Limit)
		{
			const int DeletedCount = Frames.size() / 2;
			for (auto i = 0; i < DeletedCount; i++)
			{
				auto Data = Frames[i];
				if (Data)
				{
					delete Data;
				}
				Data = nullptr;
			}
			Frames.erase(Frames.cbegin(), Frames.cbegin() + DeletedCount);
			return true;
		}
		return false;
	}
};
```
SampleByTimeStamp() interpolates the frame data; the interpolated frame is stored in **SampledHumanData**.
```c++
bool AChingmuMocapReceiverActor::SampleByTimeStamp(std::vector<ST_MocapFrameData*>& DataList)
{
	// UMotionUtils::BackSampleTime = 100 ms: sample 100 ms into the past.
	const int64 SampleTime = ULiveDirectorStatics::GetUnixTime() - UMotionUtils::BackSampleTime;
	int Previous = -1;
	int Next = -1;
	// Walk the data from last to first to find the two frames to interpolate between.
	for (int Index = DataList.size() - 1; Index > 0; Index--)
	{
		const ST_MocapFrameData* Data = DataList[Index];
		if (Data == nullptr)
		{
			continue;
		}
		if (Data->TimeStamp - SampleTime > 0)
		{
			Next = Index;
		}
		else
		{
			Previous = Index;
			break;
		}
	}

	if (bShowSampleLog)
	{
		UE_LOG(LogTemp, Warning, TEXT("prev: %d, next: %d, total: %llu"), Previous, Next, DataList.size());
	}
	if (Previous != -1 && Next != -1)
	{
		const auto p = DataList[Previous];
		const auto n = DataList[Next];
		const float Factor = (n->TimeStamp - p->TimeStamp) > 0
			                     ? (1.0 * (SampleTime - p->TimeStamp) / (n->TimeStamp - p->TimeStamp))
			                     : 1.0;
		// Bone world pos cannot lerp like this
		// It will cause bone length changes all the time
		SampledHumanData->ID = p->ID;
		SampledHumanData->TimeStamp = SampleTime;
		SampledHumanData->FrameIndex = p->FrameIndex;
		for (auto Index = 0; Index < 23; Index++) // interpolate all 23 bones
		{
			SampledHumanData->BonesWorldPos[Index] = UKismetMathLibrary::VLerp(
				p->BonesWorldPos[Index], n->BonesWorldPos[Index], Factor);
			SampledHumanData->BonesLocalRot[Index] = UKismetMathLibrary::RLerp(p->BonesLocalRot[Index].Rotator(),
			                                                                   n->BonesLocalRot[Index].Rotator(),
			                                                                   Factor, true).Quaternion();
		}
		return true;
	}
	if (Previous != -1) // fallback: only older frames exist; if too old, clear the list
	{
		SampledHumanData->CopyFrom(DataList[Previous]);

		if(SampleTime - DataList[Previous]->TimeStamp > UMotionUtils::MotionTimeout)
		{
			// data is too old, clear the data list.
			DataList.clear();
		}
		return true;
	}
	if (Next != -1) // no Previous frame; just copy Next
	{
		SampledHumanData->CopyFrom(DataList[Next]);
		return true;
	}
	return false;
}
```
### FChingmuThread

Responsibilities:

- Gets the current system time.
- Uses async tasks calling **UChingMUComponent::FullBodyMotionCapBaseBonesLocalSpaceRotation()** to update each performer's mocap data. The data is stored in **ChingMUComponent**'s ***LocalRotationList*** and ***GlobalLocationList***.
- Maintains HumanToLastReceiveTime, tracking how current each performer's animation data is.
- OwnerActor->OnGetHumanData_NotInGameThread():
	- Copies data from UChingMUComponent into an [[#ST_MocapFrameData]], based on the current time and frame count.
	- Converts the [[#ST_MocapFrameData]] to JSON and sends it via AMotionSenderActor::OnGetRawMocapData_NotInGameThread().
	- Enqueues the current frame into FrameQueue.
- Sleeps the thread briefly (ThreadInterval, 0.002 s by default), which gives AChingmuMocapReceiverActor::Tick() time to drain all queued data.
```c++
uint32 FChingmuThread::Run()
{
	FTransform Tmp;
	while (bRun)
	{
		if (OwnerActor && OwnerActor->UseThread && OwnerActor->ChingmuComp && OwnerActor->ChingmuComp->IsConnected())
		{
			CurTime = ULiveDirectorStatics::GetUnixTime();
			// Human
			for (auto HumanIndex = 0; HumanIndex < OwnerActor->MaxHumanCount; HumanIndex++)
			{
				const auto bRes = OwnerActor->ChingmuComp->FullBodyMotionCapBaseBonesLocalSpaceRotation(
					OwnerActor->ChingmuFullAddress, HumanIndex, TmpTimeCode);
				if (bRes)
				{
					if (!HumanToLastReceiveTime.Contains(HumanIndex)) // first data for this human
					{
						HumanToLastReceiveTime.Add(HumanIndex, 0);
					}
					if (HumanToLastReceiveTime[HumanIndex] != TmpTimeCode.Frames) // new frame received?
					{
						HumanToLastReceiveTime[HumanIndex] = TmpTimeCode.Frames;
						OwnerActor->OnGetHumanData_NotInGameThread(HumanIndex, CurTime, TmpTimeCode.Frames);
					}
					else
					{
						// get same frame, skip
						break;
					}
				}
			}

			// Rigidbody
			for (auto RigidBodyIndex = OwnerActor->RigidBodyStartIndex; RigidBodyIndex < OwnerActor->RigidBodyStartIndex
				 + OwnerActor->MaxRigidBodyCount; RigidBodyIndex++)
			{
				OwnerActor->ChingmuComp->GetTrackerPoseTC(OwnerActor->ChingmuFullAddress, RigidBodyIndex, Tmp,
				                                          TmpTimeCode);

				if (!RigidBodyToLastReceiveTransform.Contains(RigidBodyIndex))
				{
					RigidBodyToLastReceiveTransform.Add(RigidBodyIndex, FTransform::Identity);
				}
				// TmpTimeCode.Frames is always 0 for props, so frame count cannot
				// be used for change detection; compare transforms instead.
				if (!RigidBodyToLastReceiveTransform[RigidBodyIndex].Equals(Tmp))
				{
					RigidBodyToLastReceiveTransform[RigidBodyIndex] = Tmp;
					OwnerActor->OnGetRigidBodyData_NotInGameThread(RigidBodyIndex, Tmp, CurTime, TmpTimeCode.Frames);
				}
			}
		}
		if (bRun)
		{
			FPlatformProcess::Sleep(OwnerActor ? OwnerActor->ThreadInterval : 0.004);
		}
		else
		{
			break;
		}
	}
	UE_LOG(LogTemp, Warning, TEXT("%s finish work."), *ThreadName)
	return 0;
}
```
## ST_MocapFrameData

- ST_MocapFrameData is the raw per-frame mocap data.
```c++
#define MOCAP_BONE_COUNT 23

enum E_MotionType
{
	Human,
	RigidBody
};

enum E_SourceType
{
	Mocap,
	CMR,
	Replay
};

struct ST_MocapFrameData
{
	int ID;
	int64 TimeStamp;
	int FrameIndex;
	E_MotionType MotionType;
	E_SourceType SourceType;
	FVector BonesWorldPos[MOCAP_BONE_COUNT];
	FQuat BonesLocalRot[MOCAP_BONE_COUNT];
};

class LIVEDIRECTOR_API UMocapFrameData : public UObject
{
	GENERATED_BODY()

public:
	UPROPERTY(BlueprintReadWrite, EditAnywhere)
	int ID;
	UPROPERTY(BlueprintReadWrite, EditAnywhere)
	TArray<FVector> BonesWorldPos;
	UPROPERTY(BlueprintReadWrite, EditAnywhere)
	TArray<FQuat> BonesLocalRot;
	UPROPERTY(BlueprintReadWrite, EditAnywhere)
	int64 TimeStamp;
	UPROPERTY(BlueprintReadWrite, EditAnywhere)
	int FrameIndex;
	UPROPERTY(BlueprintReadWrite, EditAnywhere)
	int MotionType; // 0 human; 1 rigidbody
	UPROPERTY(BlueprintReadWrite, EditAnywhere)
	int SourceType; // 0 mocap, 1 cmr
public:
	void CopyFrom(const ST_MocapFrameData* Other)
	{
		ID = Other->ID;
		TimeStamp = Other->TimeStamp;
		FrameIndex = Other->FrameIndex;
		MotionType = Other->MotionType;
		SourceType = Other->SourceType;
		for (auto Index = 0; Index < 23; Index++)
		{
			BonesWorldPos[Index] = Other->BonesWorldPos[Index];
			BonesLocalRot[Index] = Other->BonesLocalRot[Index];
		}
	}
};

class MocapFrames
{
public:
	int ID;
	std::vector<ST_MocapFrameData*> Frames = {};

	void CalculatePackageAverageInterval(float& Res)
	{
		if(Frames.size() > 0)
		{
			auto First = Frames[0];
			auto Last = Frames[Frames.size() - 1];
			if(Last->FrameIndex > First->FrameIndex)
			{
				Res = 1.0 * (Last->TimeStamp - First->TimeStamp) / (Last->FrameIndex - First->FrameIndex);
			}
		}
	}
};
```
# MotionCapture (Chingmu Plugin)

Implements 1 component and 3 animation nodes:

- [[#ChingMUComponent]]
- [[#AnimNode_ChingMUPose]]: receives skeletal mocap data.
- [[#AnimNode_ChingMURetargetPose]]: receives retargeted skeletal mocap data.
- AnimNode_ChingMURetargetPoseForBuild
## ***ChingMUComponent***

1. Init
	1. BeginPlay(): reads the config from the ini file; gets the current character's SkeletonMesh => CharacterSkinMesh; builds the BoneName => BoneIndex map, the bone rotations in the T-pose, and TposeParentBonesRotation.
2. Connect
	1. StartConnectServer(): motionCapturePlugin->ConnectCommand = "ConnectServer". The actual work happens in FMotionCapture::Tick().
	2. DisConnectServer(): motionCapturePlugin->ConnectCommand = "DisConnect".
3. [[#CalculateBoneCSRotation()]]
4. [[#FullBodyMotionCapBaseBonesLocalSpaceRotation]]
### CalculateBoneCSRotation

> Get Human Fullbody Tracker data, including 23 joints' localRotation and the root joint's world Position

1. m_motioncap->CMHuman(): calls the DLL's CMHumanExtern() and gets a double array; the first 3 values are the RootLocation, the rest are rotations.
2. Computes the final quaternion rotation values.
3. Returns via the out parameter FQuat* BonesComponentSpaceRotation (an array pointer).
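The double-array layout from step 1 can be sketched as below. Only the first 3 values are specified in the note; the per-joint encoding (3 rotation components for each of the 23 joints) is an assumption for illustration.

```cpp
#include <cstddef>
#include <vector>

struct Vec3d { double X, Y, Z; };

// Hypothetical unpacking of the double array returned by CMHumanExtern():
// per the note, the first 3 doubles are the root location and the remaining
// values are rotations. 23 joints * 3 components each is assumed here.
bool UnpackCMHuman(const std::vector<double>& Raw,
                   Vec3d& OutRootLocation,
                   std::vector<Vec3d>& OutJointRotations)
{
    constexpr std::size_t JointCount = 23;
    if (Raw.size() < 3 + JointCount * 3)
        return false; // not enough data for one full-body frame
    OutRootLocation = {Raw[0], Raw[1], Raw[2]};
    OutJointRotations.clear();
    for (std::size_t j = 0; j < JointCount; ++j)
    {
        const std::size_t Base = 3 + j * 3;
        OutJointRotations.push_back({Raw[Base], Raw[Base + 1], Raw[Base + 2]});
    }
    return true;
}
```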
### FullBodyMotionCapBaseBonesLocalSpaceRotation

Compared with CalculateBoneCSRotation, this additionally fetches the timecode and the GlobalLocation mocap data.

1. m_motioncap->CMHuman(): calls the DLL's CMHumanExtern() and gets a double array; the first 3 values are the RootLocation, the rest are rotations.
2. motionCapturePlugin->CMHumanGlobalRTTC(): calls the DLL's CMHumanGlobalRTTC() (a 1-24 new feature). Computes the **VrpnTimeCode** and the **GlobalLocationList**.

The data is stored in **ChingMUComponent**'s ***LocalRotationList*** and ***GlobalLocationList***.
## FAnimNode_ChingMUPose

1. Initialize_AnyThread(): gets the **ChingMUComponent**.
2. Update_AnyThread(): calls **ChingMUComponent->CalculateBoneCSRotation()**.
3. Evaluate_AnyThread(): iterates over the 23 bones; takes the RefPose and overwrites it (in ComponentSpace) with the mocap **Rotation** data obtained in Update_AnyThread(); **the root bone additionally gets Location data**. Finally converts the data from ComponentSpace to LocalSpace.
## AnimNode_ChingMURetargetPose

1. Initialize_AnyThread(): creates the curve logic (TCHour, TCMinute, TCSecond, TCFrame).
2. Update_AnyThread():
3. Evaluate_AnyThread(): all the relevant logic is implemented here.

### AnimNode_ChingMURetargetPose::Evaluate_AnyThread()
# TsMotionReceiverActor

Only calls this.MarkAsClientSeamlessTravel() in BeginPlay(); the actual logic lives in `AMotionReceiverActor`.

## MotionReceiverActor

![[动捕逻辑思维导图.canvas]]
# Config and BoneName Logic

1. Config/FullBodyConfig.json stores the bone names, Morphs, and RootMotion bone name.
	1. Use UMotionUtils::GetModelBones(), UMotionUtils::GetMoveableBones(), and UMotionUtils::GetMorphTargets() to get the name arrays.
2. GetModelBones()
	1. Mainly called from FAnimNode_FullBody::Initialize_AnyThread().
	2. Fills `TArray<FBoneReference> BoneRefList;` and initializes SampledFullBodyData along the way.
	3. InitBoneRefIndex() initializes the BoneIndex of each FBoneReference in BoneRefList (looked up by bone name); a log message is emitted for any name that is not found.
	4. FAnimNode_FullBody::Evaluate_AnyThread() applies the data in [[#ApplyDataToPose()]].
3. GetMorphTargets()
	1. Mainly called from FAnimNode_FullBody::Initialize_AnyThread().
## ApplyDataToPose()

### BoneTransform

1. Iterate over BoneRefList (obtained from UMotionUtils::GetModelBones()).
2. For each bone with a valid BoneIndex:
	1. Get the **bone index** in the anim blueprint's output pose and the **rotation from the sampled mocap data**.
	2. If the bone name is Hips, store the current index as HipsIndex.
	3. Apply the rotation to the OutputPose.
	4. If the bone name is listed in MoveableBones, also set that bone's Location on the OutputPose.
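The BoneTransform pass can be sketched with plain standard-library types (not the real UE pose/bone types; the struct names here are illustrative):

```cpp
#include <cstddef>
#include <string>
#include <unordered_set>
#include <vector>

struct Quat { float X = 0, Y = 0, Z = 0, W = 1; };
struct Vec3 { float X = 0, Y = 0, Z = 0; };
struct BoneRef { std::string Name; int BoneIndex = -1; };
struct PoseBone { Quat Rotation; Vec3 Location; };

// Sketch of the BoneTransform pass: copy sampled mocap rotations onto the
// output pose, remember the Hips index for the later RootMotion step, and
// only write locations for bones listed as moveable.
int ApplyBoneTransforms(const std::vector<BoneRef>& BoneRefList,
                        const std::vector<Quat>& SampledRot,
                        const std::vector<Vec3>& SampledPos,
                        const std::unordered_set<std::string>& MoveableBones,
                        std::vector<PoseBone>& OutPose)
{
    int HipsIndex = -1;
    for (std::size_t i = 0; i < BoneRefList.size(); ++i)
    {
        const BoneRef& Ref = BoneRefList[i];
        if (Ref.BoneIndex < 0 || Ref.BoneIndex >= static_cast<int>(OutPose.size()))
            continue; // skip bones whose index was never resolved
        if (Ref.Name == "Hips")
            HipsIndex = Ref.BoneIndex;
        OutPose[Ref.BoneIndex].Rotation = SampledRot[i];
        if (MoveableBones.count(Ref.Name) > 0)
            OutPose[Ref.BoneIndex].Location = SampledPos[i];
    }
    return HipsIndex;
}
```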
### MorphValues

Applies each MorphTarget value to its corresponding CurveChannel.

### RootMotion

Executes one of two paths depending on bUseHipsTranslation:
#### MapTranslationToHips

Called with the following arguments:

```c++
MapTranslationToHips(Output, EvaluatedFullBodyData, 0, HipsIndex);
```

1. Read the Joints bone's Location as the RootMotion data.
2. If the Joints bone **is** the root bone, adjust the RootMotion axes.
3. Zero out the Joints bone's Location.
4. If the Hips bone is valid, add the RootMotion data onto its Location.
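Those steps reduce to a small transfer of translation from one bone to another. A sketch with plain types (the engine-specific axis remap from step 2 is left as a placeholder, since its exact mapping is not given in the note):

```cpp
#include <vector>

struct Vec3 { float X = 0, Y = 0, Z = 0; };

// Sketch of MapTranslationToHips: read the Joints bone's location as root
// motion, zero it out, then add the offset onto the Hips bone.
void MapTranslationToHips(std::vector<Vec3>& Locations, int JointsIndex, int HipsIndex, bool bJointsIsRoot)
{
    Vec3 RootMotion = Locations[JointsIndex];
    if (bJointsIsRoot)
    {
        // Placeholder: the real code remaps the axes here when Joints is the root bone.
    }
    Locations[JointsIndex] = Vec3{}; // zero out Joints
    if (HipsIndex >= 0 && HipsIndex < static_cast<int>(Locations.size()))
    {
        Locations[HipsIndex].X += RootMotion.X;
        Locations[HipsIndex].Y += RootMotion.Y;
        Locations[HipsIndex].Z += RootMotion.Z;
    }
}
```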
#### ExtractRootMotionInfo

1. Read the Joints bone's Location as the RootMotion data.
2. If the Joints bone is **not** the root bone, adjust the RootMotion axes (**the axis mapping differs from MapTranslationToHips()**).
3. Zero out the Joints bone's Location.
4. Set the RootMotion onto the AnimInstance's RootMotionLocation.
5. If the Hips bone is valid, a series of computations finally sets the Rotation onto the AnimInstance's RootMotionRotation.
02-Note/ASoul/动画相关/动捕逻辑思维导图.canvas
{
	"nodes":[
{"id":"2666bc7c541cb485","type":"text","text":"FChingmuThread::Run()\n\n发送数据\nOnGetHumanData_NotInGameThread() => PutMocapDataIntoQueue => Sender->OnGetRawMocapData_NotInGameThread(jsonStr);\n\n```c++\nwhile (bRun)\n{\n\tif (OwnerActor && OwnerActor->UseThread && OwnerActor->ChingmuComp && OwnerActor->ChingmuComp->IsConnected())\n\t{\n\t\tCurTime = ULiveDirectorStatics::GetUnixTime();\n\t\t// Human\n\t\tfor (auto HumanIndex = 0; HumanIndex < OwnerActor->MaxHumanCount; HumanIndex++)\n\t\t{\n\t\t\tconst auto bRes = OwnerActor->ChingmuComp->FullBodyMotionCapBaseBonesLocalSpaceRotation(\n\t\t\t\tOwnerActor->ChingmuFullAddress, HumanIndex, TmpTimeCode);\n\t\t\tif (bRes)\n\t\t\t{\n\t\t\t\tif (!HumanToLastReceiveTime.Contains(HumanIndex))\n\t\t\t\t{\n\t\t\t\t\tHumanToLastReceiveTime.Add(HumanIndex, 0);\n\t\t\t\t}\n\t\t\t\tif (HumanToLastReceiveTime[HumanIndex] != TmpTimeCode.Frames)\n\t\t\t\t{\n\t\t\t\t\tHumanToLastReceiveTime[HumanIndex] = TmpTimeCode.Frames;\n\t\t\t\t\tOwnerActor->OnGetHumanData_NotInGameThread(HumanIndex, CurTime, TmpTimeCode.Frames);\n\t\t\t\t}\n\t\t\t\telse\n\t\t\t\t{\n\t\t\t\t\t// get same frame, skip\n\t\t\t\t\tbreak;\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\n\t}\n\tif (bRun)\n\t{\n\t\tFPlatformProcess::Sleep(OwnerActor ? OwnerActor->ThreadInterval : 0.004);\n\t}\n\telse\n\t{\n\t\tbreak;\n\t}\n}\n\n```","x":-600,"y":-420,"width":980,"height":1180},
{"id":"c5705d4ff792be0b","type":"text","text":"**ChingmuComp.StartConnectServer()** 在UI界面控制链接服务器。\nAChingmuMocapReceiverActor::BeginPlay()创建FChingmuThread。","x":-360,"y":-640,"width":500,"height":140},
{"id":"668c865498842d96","type":"text","text":"AChingmuMocapReceiverActor::Tick()\n\n```c++\nconst auto CurTime = ULiveDirectorStatics::GetUnixTime();\nif(UseThread)\n{\n\t// 线程方式\n\t// 在数据队列中获取青瞳数据\n\twhile (!FrameQueue.IsEmpty())\n\t{\n\t\tST_MocapFrameData* Frame;\n\t\tif (FrameQueue.Dequeue(Frame))\n\t\t{\n\t\t\tPutMocapDataIntoFrameList(Frame);\n\t\t}\n\t}\n}\n\nDoSample(AllHumanFrames);\nDoSample(AllRigidBodyFrames);\n\n// 每隔1s计算一次平均包间隔\nif (CurTime - LastCheckIntervalTime > 1000)\n{\n\tif (AllHumanFrames.Num() > 0)\n\t{\n\t\tAllHumanFrames[0]->CalculatePackageAverageInterval(this->PackageAverageInterval);\n\t\tLastCheckIntervalTime = CurTime;\n\t}\n}\n```","x":-600,"y":820,"width":980,"height":800},
{"id":"04df15f334d740f3","type":"text","text":"IdolAnimInstance & Anim_FullBody\n\nIdolAnimInstance:主要是取得场景中的**AMotionReceiverActor**以及设置身份。\nAnim_FullBody:\n\n```c++\nvoid FAnimNode_FullBody::Update_AnyThread(const FAnimationUpdateContext& Context)\n{\n\tSourcePose.Update(Context);\n\tEMotionSourceType MotionSourceType = EMotionSourceType::MST_MotionServer;\n\tconst UIdolAnimInstance* IdolAnimInstance = Cast<UIdolAnimInstance>(\n\t\tContext.AnimInstanceProxy->GetAnimInstanceObject());\n\tif (IdolAnimInstance)\n\t{\n\t\tMotionSourceType = IdolAnimInstance->GetMotionSourceType();\n\t}\n\tif (MotionSourceType == EMotionSourceType::MST_MotionServer)\n\t{\n\t\tconst FString ValidIdentity = GetFullBodyIdentity(Context);\n\t\tconst auto Recv = GetMotionReceiver(Context);\n\t\tif (!ValidIdentity.IsEmpty() && Recv.IsValid())\n\t\t{\n\t\t\tbGetMotionData = Recv->SampleFullBodyData_AnimationThread(ValidIdentity,\n\t\t\t ULiveDirectorStatics::GetUnixTime() -\n\t\t\t UMotionUtils::BackSampleTime * 2,\n\t\t\t SampledFullBodyData);\n\t\t}\n\t}\n}\n\nvoid FAnimNode_FullBody::Evaluate_AnyThread(FPoseContext& Output)\n{\n\tSourcePose.Evaluate(Output);\n\tif (!InitializedBoneRefIndex)\n\t{\n\t\tInitBoneRefIndex(Output);\n\t\tInitializedBoneRefIndex = true;\n\t}\n\tEMotionSourceType MotionSourceType = EMotionSourceType::MST_MotionServer;\n\tconst UIdolAnimInstance* IdolAnimInstance = Cast<UIdolAnimInstance>(\n\t\tOutput.AnimInstanceProxy->GetAnimInstanceObject());\n\tif (IdolAnimInstance)\n\t{\n\t\tMotionSourceType = IdolAnimInstance->GetMotionSourceType();\n\t}\n\n\tFMotionFrameFullBodyData& EvaluatedFullBodyData = SampledFullBodyData;\n\n\tswitch (MotionSourceType)\n\t{\n\tcase EMotionSourceType::MST_MotionServer:\n\t\tif (!bGetMotionData)\n\t\t{\n\t\t\treturn;\n\t\t}\n\t\tEvaluatedFullBodyData = SampledFullBodyData;\n\t\tbreak;\n\tcase EMotionSourceType::MST_SequoiaReplay:\n\t\t{\n\t\t\t// Evaluate from sequoia source.\n\t\t\tconst FSequoiaMotionSource& MotionSource = FSequoiaMotionSource::Get();\n\t\t\tconst FString ValidIdentity = GetFullBodyIdentity(Output);\n\t\t\tif (const FMotionFrameFullBodyData* FrameSnapshot = MotionSource.EvaluateFrame_AnyThread(ValidIdentity))\n\t\t\t{\n\t\t\t\tEvaluatedFullBodyData = *FrameSnapshot;\n\t\t\t\tbGetMotionData = true;\n\t\t\t}\n\t\t\telse\n\t\t\t{\n\t\t\t\tUE_LOG(LogTemp, Warning, TEXT(\"%s No Sequoia Frame Data found.AvatarName=%s\"),\n\t\t\t\t ANSI_TO_TCHAR(__FUNCTION__), *ValidIdentity)\n\t\t\t\tbGetMotionData = false;\n\t\t\t\treturn;\n\t\t\t}\n\t\t}\n\n\t\tbreak;\n\tdefault:\n\t\tbreak;\n\t}\n\n\tApplyDataToPose(Output, EvaluatedFullBodyData);\n}\n```","x":-960,"y":1720,"width":1700,"height":2080},
{"id":"778e83e66edd5118","x":-903,"y":3980,"width":1586,"height":197,"type":"text","text":"bool AMotionReceiverActor::SampleFullBodyData_AnimationThread()\n1. 对CharacterToFrameList里的角色数据进行采样,并将采样数据存储到SampledFullBodyData中。\n2. CharacterToFrameList的数据会在接收到网络传递的逻辑后填充,ASimpleUDPReceiverActor::OnReceiveData_NetworkThread() => ProcessReceivedData_NetworkThread => PutFrameIntoQueue_NetworkThread() "},
{"id":"521dba38cdd6c593","x":-460,"y":4300,"width":700,"height":120,"type":"text","text":"FMotionFrameFullBodyData& EvaluatedFullBodyData = SampledFullBodyData;\nApplyDataToPose(Output, EvaluatedFullBodyData);"}
	],
	"edges":[
{"id":"b6e4d43c4c38cf16","fromNode":"2666bc7c541cb485","fromSide":"bottom","toNode":"668c865498842d96","toSide":"top"},
{"id":"34998812ac1bd8a8","fromNode":"c5705d4ff792be0b","fromSide":"bottom","toNode":"2666bc7c541cb485","toSide":"top"},
{"id":"2e063b7710fd9a81","fromNode":"668c865498842d96","fromSide":"bottom","toNode":"04df15f334d740f3","toSide":"top"},
{"id":"ddef3dd868ca08bf","fromNode":"04df15f334d740f3","fromSide":"bottom","toNode":"778e83e66edd5118","toSide":"top","label":"Update_AnyThread"},
{"id":"037baa41a3eb9866","fromNode":"778e83e66edd5118","fromSide":"bottom","toNode":"521dba38cdd6c593","toSide":"top","label":"Evaluate_AnyThread"}
	]
}
02-Note/ASoul/动画相关/动画蓝图逻辑.md
# Hand IK Logic

Mainly used to **set hand poses that match certain props and to keep performers from making NG (forbidden) gestures**. The logic lives in ControlRig XXX, which takes a set of HandIKTarget Transforms. Taking the guitar as an example, the computation runs from prop loading up to RefreshInstrumentIK:

- LoadPropByConfig =>
- CheckPropPose =>
- TriggerInstrumentPose =>
- TriggerInstrumentIK
- RefreshInstrumentIK
# Retargeting

The logic is split between TsRetargetManagerComponent and the ControlRig in the anim blueprint.

- The MotionProcess side runs the retargeting logic.
- Other clients receive the Motion data broadcast by MotionProcess => MotionServer.
## TsRetargetManagerComponent

This component computes the ratio between the current character's skeleton and the standard Human skeleton, derives retargeting data from it, and enables the PostProcess stage of retargeting:

- ModelScale
- LegScale
- HipDiff
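The note does not give the exact formulas, so the sketch below only illustrates the idea: derive the parameters as length ratios against the standard Human skeleton. All formulas and parameter names here are assumptions, not the real TsRetargetManagerComponent code.

```cpp
// Hypothetical derivation of the retarget parameters listed above, as simple
// ratios between the character skeleton and the standard Human skeleton.
struct RetargetScales
{
    float ModelScale; // overall character size vs. standard Human
    float LegScale;   // leg length ratio
    float HipDiff;    // residual hip-height offset after leg scaling
};

RetargetScales ComputeRetargetScales(float CharHeight, float HumanHeight,
                                     float CharLegLen, float HumanLegLen,
                                     float CharHipHeight, float HumanHipHeight)
{
    RetargetScales S{};
    S.ModelScale = CharHeight / HumanHeight;
    S.LegScale = CharLegLen / HumanLegLen;
    S.HipDiff = CharHipHeight - HumanHipHeight * S.LegScale;
    return S;
}
```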
## ControlRig

The ControlRig contains a Mocap skeleton and the character skeleton; all controls live on the Mocap skeleton.

1. Receive the mocap data and set it onto the Mocap skeleton.
2. PostProcess.
3. Copy Rotation onto the character skeleton for every bone except Hips; Hips gets the full Transform.
4. Post-processing.
5. Transfer the Hips bone data onto Joints.
# IK Issue Log

Handling flat vs. high-heeled feet: NaiLin_ControlRig_Heel /Game/ResArt/CharacterArt/NaiLin/ControlRig/NaiLin_ControlRig_Heel