I am trying to create a system that builds a mesh along a Bézier curve. Everything seems to work, but when I calculate the UV coordinates, oddities appear where the curve bends sharply:

I thought this was because the segments along the curve had different lengths. Using the code from here: Subdivide Bezier Curves, I divided the curve into segments of equal arc length, but the result did not change at all.
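(For reference, a minimal, untested sketch of the kind of equal-arc-length sampling I mean; this is not the linked code, and it assumes the same Bezier.GetPosition used below:)

float[] EqualArcLengthTs(Vector3 p0, Vector3 p1, Vector3 p2, Vector3 p3,
                         int numSegments, int samples = 256)
{
    // Sample the curve densely and accumulate a cumulative length table.
    var lengths = new float[samples + 1];
    Vector3 prev = Bezier.GetPosition(p0, p1, p2, p3, 0f);
    for (int i = 1; i <= samples; i++)
    {
        Vector3 cur = Bezier.GetPosition(p0, p1, p2, p3, i / (float)samples);
        lengths[i] = lengths[i - 1] + Vector3.Distance(prev, cur);
        prev = cur;
    }

    // Invert the table: find the t at each equally spaced arc length.
    var ts = new float[numSegments];
    for (int j = 0; j < numSegments; j++)
    {
        float target = lengths[samples] * j / numSegments;
        int i = System.Array.BinarySearch(lengths, target);
        if (i < 0) i = ~i;              // BinarySearch returns ~insertionPoint on a miss
        i = Mathf.Clamp(i, 1, samples);
        float frac = Mathf.InverseLerp(lengths[i - 1], lengths[i], target);
        ts[j] = (i - 1 + frac) / samples;
    }
    return ts;
}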
I did notice, however, that when I increase the number of segments on the curve, the texture looks normal:

Here is the code for building a mesh:
for (int j = 0; j < numSegmentsPerCurve; j++)
{
    // Note: t runs from 0 to (n-1)/n, so no cross-section is emitted at t = 1 here.
    float t = j / (float)numSegmentsPerCurve;

    // Point on the curve and its first derivative at t.
    Vector3 point = Bezier.GetPosition(p0, p1, p2, p3, t);
    pointsForDraw.Add(point);
    Vector3 point_derivative = Bezier.GetDerivative(p0, p1, p2, p3, t);

    // Sideways direction: perpendicular to the curve in the ground plane.
    Vector3 point_tangent = Vector3.Cross(point_derivative, Vector3.up).normalized;

    // Two vertices per step, one on each side of the centerline.
    float half_width = roadWidth * 0.5f;
    var vert1 = point - point_tangent * half_width;
    var vert2 = point + point_tangent * half_width;

    v += 0.1f; // v (declared outside the loop) accumulates the V coordinate along the road

    vertices.Add(vert1);
    vertices.Add(vert2);
    uvs.Add(new Vector2(0.0f, v * size_tex));
    uvs.Add(new Vector2(1.0f, v * size_tex));
}
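(The index buffer isn't shown above; for reference, an untested sketch of one way to triangulate this two-vertices-per-step strip, assuming a List<int> triangles. The winding order may need flipping depending on orientation.)

for (int j = 0; j < numSegmentsPerCurve - 1; j++)
{
    int i0 = j * 2;      // left vertex at step j
    int i1 = i0 + 1;     // right vertex at step j
    int i2 = i0 + 2;     // left vertex at step j + 1
    int i3 = i0 + 3;     // right vertex at step j + 1
    triangles.Add(i0); triangles.Add(i2); triangles.Add(i1); // first triangle
    triangles.Add(i1); triangles.Add(i2); triangles.Add(i3); // second triangle
}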
Is there no way to solve my problem without increasing the number of segments at the bends of the curve?
PS: Thanks for the help, null! I guess I really do need to use that article. But I'm still new to shaders. I tried to adapt the shader listed in the article:
Shader "Custom/test"
{
Properties
{
_MainTex ("Texture", 2D) = "white" {}
}
SubShader
{
Tags { "RenderType"="Opaque" }
Pass
{
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
float4x4 g_matLocalToClip;
Texture2D _MainTex;
SamplerState sampler_MainTex;
struct VertexData
{
float3 pos : POSITION;
float3 uvq : TEXCOORD0;
};
void vert( VertexData vtx, out float3 uvq : TEXCOORD0, out float4 posClip : SV_Position)
{
posClip = mul(float4(vtx.pos, 1.0), g_matLocalToClip);
uvq = vtx.uvq;
}
void frag( float3 uvq : TEXCOORD0, out half4 o_rgba : SV_Target)
{
o_rgba = _MainTex.Sample(sampler_MainTex, uvq.xy / uvq.z);
}
ENDCG
}
}
}
But when a material using this shader is applied to the mesh, this happens:

I think I'm doing something wrong. Could it be because I'm using Unity URP? Could you point out the mistake to me?
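One possible culprit, separate from the UV issue: g_matLocalToClip is a custom uniform from the article, and nothing in Unity assigns it automatically. It either has to be supplied from a script, or the vert function can use UnityObjectToClipPos(vtx.pos) from UnityCG.cginc instead. A minimal, untested sketch of supplying it (the class name is made up; the transpose matches the row-vector mul(float4(pos, 1), M) form used above):

// Hypothetical companion script for the shader above (untested sketch):
// supplies the custom g_matLocalToClip uniform every frame.
using UnityEngine;

[RequireComponent(typeof(Renderer))]
public class SupplyLocalToClip : MonoBehaviour
{
    void LateUpdate()
    {
        Camera cam = Camera.main;
        // Column-vector local-to-clip matrix: projection * view * model.
        Matrix4x4 m = GL.GetGPUProjectionMatrix(cam.projectionMatrix, false)
                    * cam.worldToCameraMatrix
                    * transform.localToWorldMatrix;
        // The vert function computes mul(float4(pos, 1), M), i.e. it treats
        // the position as a row vector, so pass the transpose.
        GetComponent<Renderer>().material.SetMatrix("g_matLocalToClip", m.transpose);
    }
}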
As JonasH pointed out in a comment, this is a result of texture mapping that interpolates within triangles, not quadrilaterals. You always have that problem with a trapezoid.
This seems to be a common issue for road builders in Unity; see MCrafterzz's thread "[HELP] Giving a productionaly generated symmetric mesh stretched the mesh in a weird way":
Hello, I'm currently working on a free road creation tool for Unity and have run into a UV problem. I hope this is the right forum category. Here you can see:
jatkinson_unity's "Warped texture on a non square quad" is a nice minimal example of the problem:
I’m generating a “non square” quad mesh and applying a texture to it. Unity will split the quad into triangles, and then the middle of the texture is at the halfway point of the edge between the 2 triangles, which I think is “Correct” but also obviously not what I’m looking for.
List<Vector3> verticies = new List<Vector3>();
List<Vector2> uvs = new List<Vector2>();
List<int> indices = new List<int>();

verticies.Add(new Vector3(0.25f, 0.0f, 0.0f));
verticies.Add(new Vector3(0.0f, 1.0f, 0.0f));
verticies.Add(new Vector3(1.0f, 1.0f, 0.0f));
verticies.Add(new Vector3(0.75f, 0.0f, 0.0f));

uvs.Add(new Vector2(0.0f, 0.0f));
uvs.Add(new Vector2(0.0f, 1.0f));
uvs.Add(new Vector2(1.0f, 1.0f));
uvs.Add(new Vector2(1.0f, 0.0f));

indices.Add(0);
indices.Add(1);
indices.Add(2);
indices.Add(3);

mesh.SetVertices(verticies);
mesh.SetUVs(0, uvs);
mesh.SetIndices(indices.ToArray(), MeshTopology.Quads, 0);
Which Bunny83 explains, emphasis mine:
What you are looking at is just affine texture mapping. Technically Unity (or your gpu) does perform perspective correct mapping, however since it’s not perspective that is causing the shrinking on one side, the GPU doesn’t get any hints how it should be corrected as every triangle is rendered on its own.
Severin's question How to control UV mapping distortion has a nice illustration of reality vs. expectation (go there, upvote):

OK, so that's the problem. Don't cut your ear off; here's what you can do about it:
I really dig null's answer; it's a great explanation of what's going on!
Here is how I attacked it: by using more than two dimensions for the UV vector! You can use the 3rd and 4th coordinates to make the interpolation work properly, since the shader will interpolate over ALL of the coordinates.
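To spell out the math behind the trick (my summary, not part of the original answer): if each vertex stores $(u\,q,\ v\,q,\ q)$ instead of $(u, v)$, the GPU still interpolates every component linearly, but after the fragment shader divides through by the interpolated $q$, along an edge you get

$$u(t) = \frac{(1-t)\,u_0 q_0 + t\,u_1 q_1}{(1-t)\,q_0 + t\,q_1},$$

a projective (rational-linear) blend of $u_0$ and $u_1$ rather than an affine one. The function below picks the $q$ values heuristically from the quad's edge-length ratios.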
Obviously, this operation depends on the shape of the quad, so you'll need the corner positions as well. Here is a function that takes regular 2D UV coordinates for a quad and converts them to 4D for you. (Warning: no guarantees on this code; it's been a long time since I've touched it.)
/// <summary>
/// Given 4 quad vertices, and a reference to their current uv values: adjusts the uv values to reduce affine distortion
/// </summary>
/// <param name="verts">the 4 verts of the quad</param>
/// <param name="uvs">the 4 uv coordinates of the quad, one for each vertex. These ref values are adjusted by this function.</param>
public void DeAffineQuadUVS(Vector3[] verts, ref Vector4[] uvs)
{
    // 1 - 2
    // | / |
    // 0 - 3

    // DO UV.X
    Vector3 edge = (verts[0] - verts[3]);
    Vector3 oppositeEdge = (verts[1] - verts[2]);
    float edgeLen = edge.magnitude;
    float oppositeEdgeLen = oppositeEdge.magnitude;
    float edgeLengthRatio = 1.0f;
    float inverseRatio = 1.0f;
    float cosOfAngleBetweenEdges;
    if (!Mathf.Approximately(edgeLen, oppositeEdgeLen))
    {
        cosOfAngleBetweenEdges = Vector3.Dot(oppositeEdge / oppositeEdgeLen, edge / edgeLen);
        edgeLengthRatio = edgeLen / oppositeEdgeLen;
        // Yields sqrt(edgeLengthRatio) = edgeLengthRatio^0.5 when there is no
        // angle between the edges; approaches edgeLengthRatio^1.5 as the edges
        // approach perpendicular.
        edgeLengthRatio = Mathf.Pow(edgeLengthRatio, (1.0f - cosOfAngleBetweenEdges) + 0.5f);
        inverseRatio = 1.0f / edgeLengthRatio;
        if (!Mathf.Approximately(edgeLengthRatio, 1.0f))
        {
            uvs[0].x *= edgeLengthRatio;
            uvs[0].z = edgeLengthRatio;
            uvs[3].x *= edgeLengthRatio;
            uvs[3].z = edgeLengthRatio;
            uvs[1].x *= inverseRatio;
            uvs[1].z = inverseRatio;
            uvs[2].x *= inverseRatio;
            uvs[2].z = inverseRatio;
        }
    }

    // 1 - 2
    // | / |
    // 0 - 3

    // DO UV.Y
    edge = (verts[0] - verts[1]);
    oppositeEdge = (verts[2] - verts[3]);
    edgeLen = edge.magnitude;
    oppositeEdgeLen = oppositeEdge.magnitude;
    if (!Mathf.Approximately(edgeLen, oppositeEdgeLen))
    {
        cosOfAngleBetweenEdges = Vector3.Dot(oppositeEdge / oppositeEdgeLen, edge / edgeLen);
        edgeLengthRatio = edgeLen / oppositeEdgeLen;
        edgeLengthRatio = Mathf.Pow(edgeLengthRatio, (1.0f - cosOfAngleBetweenEdges) + 0.5f);
        inverseRatio = 1.0f / edgeLengthRatio;
        if (!Mathf.Approximately(edgeLengthRatio, 1.0f))
        {
            uvs[0].y *= edgeLengthRatio;
            uvs[0].w = edgeLengthRatio;
            uvs[1].y *= edgeLengthRatio;
            uvs[1].w = edgeLengthRatio;
            uvs[3].y *= inverseRatio;
            uvs[3].w = inverseRatio;
            uvs[2].y *= inverseRatio;
            uvs[2].w = inverseRatio;
        } // end ratio != 1
    }
} // end DeAffineQuadUVS
Once your mesh specifies 4D UV coordinates, you'll need to adjust your shader as well in order to actually USE the additional coordinate values:
float4 frag(vertexOutput input) : COLOR
{
    // Divide the interpolated xy by the interpolated zw to undo the affine skew.
    return tex2D(_MainTex, float2(input.uv.xy) / float2(input.uv.zw));
}
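For completeness, an untested sketch of wiring it all together: start from ordinary 2D UVs widened to Vector4 with z = w = 1, run DeAffineQuadUVS, then upload the 4D UVs with Mesh.SetUVs, which accepts a List<Vector4>. (The class name is made up, and it assumes DeAffineQuadUVS from above lives in the same class.)

using System.Collections.Generic;
using UnityEngine;

[RequireComponent(typeof(MeshFilter))]
public class DeAffineExample : MonoBehaviour
{
    void Start()
    {
        // The trapezoid from jatkinson_unity's example above.
        Vector3[] verts =
        {
            new Vector3(0.25f, 0f, 0f), // 0: bottom-left
            new Vector3(0.00f, 1f, 0f), // 1: top-left
            new Vector3(1.00f, 1f, 0f), // 2: top-right
            new Vector3(0.75f, 0f, 0f), // 3: bottom-right
        };
        // Ordinary 2D UVs, widened to 4D with z = w = 1.
        Vector4[] uvs =
        {
            new Vector4(0f, 0f, 1f, 1f),
            new Vector4(0f, 1f, 1f, 1f),
            new Vector4(1f, 1f, 1f, 1f),
            new Vector4(1f, 0f, 1f, 1f),
        };
        DeAffineQuadUVS(verts, ref uvs);

        var mesh = new Mesh();
        mesh.vertices = verts;
        mesh.SetUVs(0, new List<Vector4>(uvs)); // 4D UVs land in TEXCOORD0
        mesh.triangles = new int[] { 0, 1, 2, 0, 2, 3 };
        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().mesh = mesh;
    }
}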