Can I add non-trees to terrain.terrainData.treeInstances in Unity 3D?

I need to create a brush that places stones, poles, bushes and other elements of the environment. I bought a Prefab Brush asset. Is it possible to add the objects that I paint onto the scene with the Prefab Brush to terrainData.treeInstances, the same way regular trees are added?

Is it possible to use a prefab that isn't a tree as the object for the TreeInstance class?
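To make the question concrete, here is a rough sketch of what I am trying to do; the prefab and field names are placeholders, and I don't know whether registering a non-tree prefab as a tree prototype like this is actually supported:

using System.Collections.Generic;
using UnityEngine;

public class PaintRocksAsTrees : MonoBehaviour
{
    public Terrain terrain;        // the target terrain
    public GameObject rockPrefab;  // a non-tree prefab (stone, pole, bush, ...)

    void Start()
    {
        TerrainData data = terrain.terrainData;

        // Register the rock prefab as an extra "tree" prototype.
        var prototypes = new List<TreePrototype>(data.treePrototypes);
        prototypes.Add(new TreePrototype { prefab = rockPrefab });
        data.treePrototypes = prototypes.ToArray();

        // Add one instance at the centre of the terrain (position is normalized 0..1).
        TreeInstance instance = new TreeInstance
        {
            prototypeIndex = data.treePrototypes.Length - 1,
            position = new Vector3(0.5f, 0f, 0.5f),
            widthScale = 1f,
            heightScale = 1f,
            color = Color.white,
            lightmapColor = Color.white
        };
        terrain.AddTreeInstance(instance);
        terrain.Flush();
    }
}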

UNITY – Instantiate is spawning too many clones

I'm at a bit of an impasse with this one, if anyone can help.

I am developing an endless runner, and I am trying to instantiate my obstacle prefabs by having my player pass through a trigger that activates an instantiation. The problem I'm having is that when the player goes through the trigger, my code seems to instantiate multiple prefabs at a time. I'll post screenshots and my code to illustrate this better. The player hits the first trigger, instantiating 3 clones:

The player hits the next trigger, which instantiates even more clones:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class ObsGenerator : MonoBehaviour
{
    public GameObject spawnTrigger;
    public GameObject Player;
    [SerializeField] private List<Transform> obstacleList;

    public Transform spawnLocation;

    public void OnTriggerEnter2D(Collider2D Player)
    {
        if (Player.CompareTag("Respawn"))
        {
            // for (int i = 0; i < obstacleList.Count; i++)
            {
                spawnObstacle();
            }
        }
    }

    public void spawnObstacle()
    {
        for (int i = 0; i < obstacleList.Count; i++)
            if (obstacleList.Count > 0)
            {
                Transform chosenLevelPart = obstacleList[Random.Range(0, obstacleList.Count)];
                Instantiate(chosenLevelPart);
            }
    }
}
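For reference, the behaviour I am after is one random obstacle per trigger hit. A minimal sketch of that intent (not my current code; the "Respawn" tag is just what I am using now):

using System.Collections.Generic;
using UnityEngine;

public class SingleSpawnSketch : MonoBehaviour
{
    [SerializeField] private List<Transform> obstacleList;

    private void OnTriggerEnter2D(Collider2D other)
    {
        // Spawn exactly one randomly chosen obstacle each time the player enters.
        if (other.CompareTag("Respawn") && obstacleList.Count > 0)
        {
            Transform chosen = obstacleList[Random.Range(0, obstacleList.Count)];
            Instantiate(chosen);
        }
    }
}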

Really appreciate anybody’s help!

Unity Game too dark on Samsung Galaxy Alpha (LWRP)

I just noticed that my game is way too dark on my old Samsung Galaxy Alpha. The game looks fine in the editor, in the PC version, and on my Samsung Galaxy S7.

I've been developing with Unity 2019.2.2f1 so far and read that some older versions had this problem, but that it should have been patched already. I upgraded to 2019.2.15f1 out of desperation, but the problem remains the same.

It's an LWRP project and I have no clue about lighting yet. There are no lights added to the project so far, and I'm mainly using GUI elements in the game instead of sprites.

(Screenshots comparing the builds were posted on imgur.)

I don't have much experience with quality settings and similar options, so I'm not sure which settings to post. Please let me know which sections are important and I'll edit the post accordingly.

Setting SteamVR Camera Target Eye before SteamVR installed in Unity

I'm writing a plugin that can be used in both VR and non-VR projects. I have a Camera that is used to display UI containing some game info on a secondary monitor. My problem is that when the end user adds my plugin and then imports SteamVR, this UI camera's Target Eye is set to "Both" by default.

I can't have any SteamVR code in my plugin, but I can't adjust the Target Eye property of the camera without it. I just want the camera not to change when a user imports SteamVR; I want it to always render to the main display (Target Eye = None).

So just to clarify,

  1. I have my plugin, written in a non-VR project with a standard Unity camera. There is no Target Eye setting for this standard camera.
  2. The user wants to use it in a VR project and imports the plugin into a SteamVR project.
  3. The camera used to render the UI automatically gets converted (wrongly) into a SteamVR camera.
  4. It defaults to Target Eye = Both.
  5. I need it to be Target Eye = None.

At the moment, to avoid users submitting countless bug reports, I have to provide warnings in about six places telling users to dig deep into my built-in prefabs and adjust this setting manually if they use SteamVR. That isn't easy for new Unity users, and I want to make adoption as easy as possible.
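One idea I am considering (just a sketch; I haven't verified whether the value survives the SteamVR import) is forcing the setting from a small script shipped with the plugin, since Camera.stereoTargetEye is a plain UnityEngine property and needs no SteamVR code:

using UnityEngine;

// Sketch: force the UI camera to render only to the main display.
[RequireComponent(typeof(Camera))]
public class ForceNoTargetEye : MonoBehaviour
{
    void Awake()
    {
        var cam = GetComponent<Camera>();
        cam.stereoTargetEye = StereoTargetEyeMask.None; // shows as "Target Eye: None" in the inspector
    }
}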

Unity 2D: Can’t resolve ArgumentOutOfRangeException: Index was out of range

I am very much a novice to coding, with no background in this at all, so the code you see here may be terrible.

Basically I'm making an endless runner, made up of a player character and a set of obstacle prefabs. The obstacle prefabs have a trigger on them so that when the player hits it, it spawns the next prefab. I'm using the code below; when the player hits the first spawn trigger it spawns a prefab, but on tripping the next trigger it gives me an ArgumentOutOfRangeException in the console and doesn't spawn anything.

using System.Collections.Generic;
using UnityEngine;

public class ObsTrigger : MonoBehaviour
{
    public GameObject spawnTrigger;
    [SerializeField] private List<Transform> obstacleList;

    private void OnTriggerEnter2D(Collider2D other)
    {
        if (other.CompareTag("Player"))
        {
            for (int i = 0; i < obstacleList.Count; i++)
            {
                spawnObstacle();
            }
        }
    }

    public void spawnObstacle()
    {
        Transform chosenLevelPart = obstacleList[Random.Range(0, obstacleList.Count - 1)];
        Instantiate(chosenLevelPart);
    }
}

If anyone can point me in the right direction I would really appreciate it!

Vibration in Unity

I would like to adjust the vibration for my game. The standard Handheld.Vibrate() method doesn't give a good result. Is there any way to control the duration and strength of the vibration?

I've heard about some kind of plugin. I can try to create it myself, but I need to know the approach. Thank you!
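From what I've read, one approach is to call the Android Vibrator service directly through Unity's JNI wrappers. A rough sketch of that idea (the deprecated vibrate(long) overload is an assumption on my part, and amplitude control would additionally need VibrationEffect on API 26+, which is not shown here):

using UnityEngine;

public static class VibrationSketch
{
    // Vibrate for a custom number of milliseconds on Android.
    public static void Vibrate(long milliseconds)
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        using (var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        using (var vibrator = activity.Call<AndroidJavaObject>("getSystemService", "vibrator"))
        {
            vibrator.Call("vibrate", milliseconds); // deprecated on API 26+, but still works
        }
#else
        // Other platforms: no-op in this sketch (Handheld.Vibrate() could be used on iOS).
#endif
    }
}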

Can I use the 2D physics engine in a 3D game (or vice versa) in Unity?

This is entirely for performance. The 2D physics engine is less expensive, but I require 3D for some scenes. I never need both at the same time. I know you can have 2D with an orthographic perspective in a 3D engine, but what I want is really the physics engine. Also, is there a way of turning these engines off? I've made most collisions from scratch and am only using them for some raycasts at the beginning and for some collider/rigidbody casts in a lot of frames, though not every frame (if I understand correctly, these are calculated by the physics engine in each FixedUpdate()).
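To illustrate what I mean by turning them off, here is a sketch that disables automatic stepping for both engines so only my explicit queries cost anything (the autoSimulation properties are the older names; newer versions expose a simulationMode property instead, so this is an assumption about the Unity version in use):

using UnityEngine;

public class PhysicsToggleSketch : MonoBehaviour
{
    void Awake()
    {
        Physics.autoSimulation = false;    // stop the automatic 3D step in FixedUpdate
        Physics2D.autoSimulation = false;  // stop the automatic 2D step in FixedUpdate
    }

    // Raycasts and collider/rigidbody casts still work against the last simulated state;
    // Physics.Simulate(Time.fixedDeltaTime) / Physics2D.Simulate(...) can step manually when needed.
}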

Applying isometric movement to a Rigidbody in Unity 3D

I'm making an isometric dungeon crawler. I initially used transform-based movement to get isometric movement and things worked well, but I couldn't use collision, which made it unsuitable. So I'm now trying to use a Rigidbody and I'm having some weird issues. First of all, this is the code I currently have:

using UnityEngine;

public class CharControllerRigid : MonoBehaviour
{
    [SerializeField]
    private Rigidbody characterRigid;
    private Vector3 inputVector;

    void Start()
    {
        characterRigid = GetComponent<Rigidbody>();
    }

    void Update()
    {
        inputVector = new Vector3(Input.GetAxis("Horizontal") * 10f, characterRigid.velocity.y, Input.GetAxisRaw("Vertical"));
        transform.LookAt(transform.position + new Vector3(inputVector.x, 0, inputVector.z));
    }

    private void FixedUpdate()
    {
        characterRigid.velocity = inputVector;
    }
}

So I have two issues: how do I get the movement to work on an isometric plane? And secondly, how do I get the movement to work in all directions? Currently, when I move left and right the movement is good, but up and down are really slow in comparison.
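For context, one approach I have seen suggested is rotating the raw input by the camera's yaw and normalizing it before feeding it to the Rigidbody. A rough sketch of that idea (the 45 degree angle and the speed value are assumptions, not something I have working):

using UnityEngine;

public class IsoMovementSketch : MonoBehaviour
{
    [SerializeField] private Rigidbody characterRigid;
    [SerializeField] private float moveSpeed = 10f;
    private Vector3 moveVector;

    void Update()
    {
        Vector3 raw = new Vector3(Input.GetAxisRaw("Horizontal"), 0f, Input.GetAxisRaw("Vertical"));
        // Rotate the input into the isometric view and normalize it so every
        // direction (including diagonals and "up/down") moves at the same speed.
        moveVector = Quaternion.Euler(0f, 45f, 0f) * raw.normalized * moveSpeed;
        if (raw.sqrMagnitude > 0.01f)
            transform.LookAt(transform.position + moveVector);
    }

    private void FixedUpdate()
    {
        // Keep the Rigidbody's own Y velocity so gravity still applies.
        characterRigid.velocity = new Vector3(moveVector.x, characterRigid.velocity.y, moveVector.z);
    }
}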

How to manage GameObject script components and values in a newly updated model in Unity

I have a large object (an FBX) that contains many GameObjects, and it has several MonoBehaviour scripts attached with different values assigned publicly in the inspector. The problem is that each time we update the model in the project (the FBX), we have to drag and drop the model/FBX into the scene again, which means we have to attach all the scripts again with the relevant values/data. I am looking for the right way to do this. Currently I place both the new and the old FBX in the scene, then one by one I copy and paste the old object's script components onto the new object, and then I delete the old object/model/FBX.

Note: I have to bring the FBX into the hierarchy again because sometimes the object doesn't update properly in the scene.
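To show the manual step I would like to automate, here is a rough sketch using UnityEditorInternal.ComponentUtility to copy every MonoBehaviour (with its inspector values) from the old instance onto the new one; the helper itself is hypothetical, not something I have working:

using UnityEngine;
#if UNITY_EDITOR
using UnityEditorInternal;
#endif

public class CopyBehavioursHelper : MonoBehaviour
{
    public GameObject target; // the newly imported FBX instance in the scene

#if UNITY_EDITOR
    [ContextMenu("Copy MonoBehaviours To Target")]
    void CopyAll()
    {
        foreach (MonoBehaviour behaviour in GetComponents<MonoBehaviour>())
        {
            if (behaviour == this) continue;               // skip the helper itself
            ComponentUtility.CopyComponent(behaviour);     // copies the serialized values
            ComponentUtility.PasteComponentAsNew(target);  // pastes them onto the new object
        }
    }
#endif
}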

Move Unity NavMeshAgent at a constant speed

My NavMeshAgent randomly changes speed while moving along its path. It seems to be slower when a segment of the path is shorter (between two waypoints/corners) or when there are many close consecutive corners ahead. I have tried changing the acceleration and angular speed, but that didn't work. I have also tried changing agent.velocity, and even though I set it to vectors of the same magnitude each frame, it still doesn't move at a constant speed. I have also checked that everything is on the same Y coordinate, since my game is top-down orthographic.
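For reference, this is roughly what I am planning to try next; a sketch with guessed values (autoBraking is the one setting I have not tested yet):

using UnityEngine;
using UnityEngine.AI;

[RequireComponent(typeof(NavMeshAgent))]
public class ConstantSpeedAgentSketch : MonoBehaviour
{
    void Start()
    {
        NavMeshAgent agent = GetComponent<NavMeshAgent>();
        agent.autoBraking = false;   // don't automatically brake when nearing the destination
        agent.acceleration = 1000f;  // reach the target speed almost instantly
        agent.angularSpeed = 720f;   // turn fast enough that corners don't limit speed
        agent.speed = 5f;
    }
}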